[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
13355 1727096149.95807: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-And
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
13355 1727096149.96709: Added group all to inventory
13355 1727096149.96712: Added group ungrouped to inventory
13355 1727096149.96716: Group all now contains ungrouped
13355 1727096149.96719: Examining possible inventory source: /tmp/network-EuO/inventory.yml
13355 1727096150.15183: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
13355 1727096150.15243: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
13355 1727096150.15265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
13355 1727096150.15330: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
13355 1727096150.15418: Loaded config def from plugin (inventory/script)
13355 1727096150.15423: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
13355 1727096150.15464: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
13355 1727096150.15551: Loaded config def from plugin (inventory/yaml)
13355 1727096150.15553: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
13355 1727096150.15641: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
13355 1727096150.16265: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
13355 1727096150.16271: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
13355 1727096150.16274: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
13355 1727096150.16281: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
13355 1727096150.16285: Loading data from /tmp/network-EuO/inventory.yml
13355 1727096150.16352: /tmp/network-EuO/inventory.yml was not parsable by auto
13355 1727096150.16610: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
13355 1727096150.16653: Loading data from /tmp/network-EuO/inventory.yml
13355 1727096150.16738: group all already in inventory
13355 1727096150.16744: set inventory_file for managed_node1
13355 1727096150.16749: set inventory_dir for managed_node1
13355 1727096150.16750: Added host managed_node1 to inventory
13355 1727096150.16752: Added host managed_node1 to group all
13355 1727096150.16753: set ansible_host for managed_node1
13355 1727096150.16754: set ansible_ssh_extra_args for managed_node1
13355 1727096150.16757: set inventory_file for managed_node2
13355 1727096150.16759: set inventory_dir for managed_node2
13355 1727096150.16760: Added host managed_node2 to inventory
13355 1727096150.16762: Added host managed_node2 to group all
13355 1727096150.16762: set ansible_host for managed_node2
13355 1727096150.16763: set ansible_ssh_extra_args for managed_node2
13355 1727096150.16766: set inventory_file for managed_node3
13355 1727096150.17031: set inventory_dir for managed_node3
13355 1727096150.17033: Added host managed_node3 to inventory
13355 1727096150.17034: Added host managed_node3 to group all
13355 1727096150.17035: set ansible_host for managed_node3
13355 1727096150.17036: set ansible_ssh_extra_args for managed_node3
13355 1727096150.17039: Reconcile groups and hosts in inventory.
13355 1727096150.17043: Group ungrouped now contains managed_node1
13355 1727096150.17045: Group ungrouped now contains managed_node2
13355 1727096150.17047: Group ungrouped now contains managed_node3
13355 1727096150.17129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
13355 1727096150.17251: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
13355 1727096150.17314: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
13355 1727096150.17348: Loaded config def from plugin (vars/host_group_vars)
13355 1727096150.17350: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
13355 1727096150.17357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
13355 1727096150.17365: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
13355 1727096150.17418: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
13355 1727096150.17745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096150.17839: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
13355 1727096150.17881: Loaded config def from plugin (connection/local)
13355 1727096150.17884: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
13355 1727096150.18515: Loaded config def from plugin (connection/paramiko_ssh)
13355 1727096150.18518: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
13355 1727096150.19338: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13355 1727096150.19382: Loaded config def from plugin (connection/psrp)
13355 1727096150.19386: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
13355 1727096150.20938: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13355 1727096150.20973: Loaded config def from plugin (connection/ssh)
13355 1727096150.20976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
13355 1727096150.26742: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13355 1727096150.26787: Loaded config def from plugin (connection/winrm)
13355 1727096150.26791: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
13355 1727096150.26823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
13355 1727096150.27325: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
13355 1727096150.27397: Loaded config def from plugin (shell/cmd)
13355 1727096150.27399: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
13355 1727096150.27426: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
13355 1727096150.27908: Loaded config def from plugin (shell/powershell)
13355 1727096150.27910: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
13355 1727096150.27989: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
13355 1727096150.28578: Loaded config def from plugin (shell/sh)
13355 1727096150.28581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
13355 1727096150.28616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
13355 1727096150.29150: Loaded config def from plugin (become/runas)
13355 1727096150.29153: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
13355 1727096150.29755: Loaded config def from plugin (become/su)
13355 1727096150.29758: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
13355 1727096150.30335: Loaded config def from plugin (become/sudo)
13355 1727096150.30337: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
13355 1727096150.30380: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
13355 1727096150.31546: in VariableManager get_vars()
13355 1727096150.31575: done with get_vars()
13355 1727096150.32128: trying /usr/local/lib/python3.12/site-packages/ansible/modules
13355 1727096150.37721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
13355 1727096150.37908: in VariableManager get_vars()
13355 1727096150.38181: done with get_vars()
13355 1727096150.38184: variable 'playbook_dir' from source: magic vars
13355 1727096150.38185: variable 'ansible_playbook_python' from source: magic vars
13355 1727096150.38186: variable 'ansible_config_file' from source: magic vars
13355 1727096150.38187: variable 'groups' from source: magic vars
13355 1727096150.38187: variable 'omit' from source: magic vars
13355 1727096150.38188: variable 'ansible_version' from source: magic vars
13355 1727096150.38189: variable 'ansible_check_mode' from source: magic vars
13355 1727096150.38189: variable 'ansible_diff_mode' from source: magic vars
13355 1727096150.38190: variable 'ansible_forks' from source: magic vars
13355 1727096150.38191: variable 'ansible_inventory_sources' from source: magic vars
13355 1727096150.38191: variable 'ansible_skip_tags' from source: magic vars
13355 1727096150.38192: variable 'ansible_limit' from source: magic vars
13355 1727096150.38193: variable 'ansible_run_tags' from source: magic vars
13355 1727096150.38194: variable 'ansible_verbosity' from source: magic vars
13355 1727096150.38234: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml
13355 1727096150.40520: in VariableManager get_vars()
13355 1727096150.40540: done with get_vars()
13355 1727096150.40550: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
13355 1727096150.42592: in VariableManager get_vars()
13355 1727096150.42610: done with get_vars()
13355 1727096150.42621: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13355 1727096150.42819: in VariableManager get_vars()
13355 1727096150.42836: done with get_vars()
13355 1727096150.43197: in VariableManager get_vars()
13355 1727096150.43211: done with get_vars()
13355 1727096150.43477: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13355 1727096150.43551: in VariableManager get_vars()
13355 1727096150.43589: done with get_vars()
13355 1727096150.44233: in VariableManager get_vars()
13355 1727096150.44249: done with get_vars()
13355 1727096150.44254: variable 'omit' from source: magic vars
13355 1727096150.44275: variable 'omit' from source: magic vars
13355 1727096150.44309: in VariableManager get_vars()
13355 1727096150.44321: done with get_vars()
13355 1727096150.44484: in VariableManager get_vars()
13355 1727096150.44498: done with get_vars()
13355 1727096150.44535: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13355 1727096150.45116: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13355 1727096150.45405: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13355 1727096150.51185: in VariableManager get_vars()
13355 1727096150.51209: done with get_vars()
13355 1727096150.52251: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
13355 1727096150.52634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13355 1727096150.54803: in VariableManager get_vars()
13355 1727096150.54823: done with get_vars()
13355 1727096150.54833: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13355 1727096150.54937: in VariableManager get_vars()
13355 1727096150.54956: done with get_vars()
13355 1727096150.55094: in VariableManager get_vars()
13355 1727096150.55132: done with get_vars()
13355 1727096150.55459: in VariableManager get_vars()
13355 1727096150.55495: done with get_vars()
13355 1727096150.55502: variable 'omit' from source: magic vars
13355 1727096150.55514: variable 'omit' from source: magic vars
13355 1727096150.55708: variable 'controller_profile' from source: play vars
13355 1727096150.55766: in VariableManager get_vars()
13355 1727096150.55782: done with get_vars()
13355 1727096150.55803: in VariableManager get_vars()
13355 1727096150.55817: done with get_vars()
13355 1727096150.55853: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13355 1727096150.55993: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13355 1727096150.56076: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13355 1727096150.56494: in VariableManager get_vars()
13355 1727096150.56525: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13355 1727096150.59310: in VariableManager get_vars()
13355 1727096150.59350: done with get_vars()
13355 1727096150.59357: variable 'omit' from source: magic vars
13355 1727096150.59371: variable 'omit' from source: magic vars
13355 1727096150.59402: in VariableManager get_vars()
13355 1727096150.59420: done with get_vars()
13355 1727096150.59453: in VariableManager get_vars()
13355 1727096150.59474: done with get_vars()
13355 1727096150.59521: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13355 1727096150.59862: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13355 1727096150.59961: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13355 1727096150.60708: in VariableManager get_vars()
13355 1727096150.60737: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13355 1727096150.63376: in VariableManager get_vars()
13355 1727096150.63405: done with get_vars()
13355 1727096150.63411: variable 'omit' from source: magic vars
13355 1727096150.63423: variable 'omit' from source: magic vars
13355 1727096150.63510: in VariableManager get_vars()
13355 1727096150.63562: done with get_vars()
13355 1727096150.63594: in VariableManager get_vars()
13355 1727096150.63616: done with get_vars()
13355 1727096150.63651: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13355 1727096150.63795: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13355 1727096150.63876: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13355 1727096150.64299: in VariableManager get_vars()
13355 1727096150.64324: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13355 1727096150.67036: in VariableManager get_vars()
13355 1727096150.67069: done with get_vars()
13355 1727096150.67075: variable 'omit' from source: magic vars
13355 1727096150.67107: variable 'omit' from source: magic vars
13355 1727096150.67145: in VariableManager get_vars()
13355 1727096150.67168: done with get_vars()
13355 1727096150.67189: in VariableManager get_vars()
13355 1727096150.67218: done with get_vars()
13355 1727096150.67248: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13355 1727096150.67390: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13355 1727096150.67475: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13355 1727096150.68000: in VariableManager get_vars()
13355 1727096150.68027: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13355 1727096150.70437: in VariableManager get_vars()
13355 1727096150.70479: done with get_vars()
13355 1727096150.70490: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
13355 1727096150.71332: in VariableManager get_vars()
13355 1727096150.71363: done with get_vars()
13355 1727096150.71437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
13355 1727096150.71452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
13355 1727096150.71703: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
13355 1727096150.71888: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
13355 1727096150.71891: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
13355 1727096150.71921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
13355 1727096150.71954: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
13355 1727096150.72376: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
13355 1727096150.72445: Loaded config def from plugin (callback/default)
13355 1727096150.72448: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
13355 1727096150.73655: Loaded config def from plugin (callback/junit)
13355 1727096150.73659: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
13355 1727096150.73715: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
13355 1727096150.73786: Loaded config def from plugin (callback/minimal)
13355 1727096150.73789: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
13355 1727096150.73830: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
13355 1727096150.73893: Loaded config def from plugin (callback/tree)
13355 1727096150.73895: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
13355 1727096150.74014: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
13355 1727096150.74017: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-And/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_removal_nm.yml ********************************************
2 plays in /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml
13355 1727096150.74046: in VariableManager get_vars()
13355 1727096150.74064: done with get_vars()
13355 1727096150.74072: in VariableManager get_vars()
13355 1727096150.74081: done with get_vars()
13355 1727096150.74090: variable 'omit' from source: magic vars
13355 1727096150.74133: in VariableManager get_vars()
13355 1727096150.74274: done with get_vars()
13355 1727096150.74297: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_removal.yml' with nm as provider] *****
13355 1727096150.75502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
13355 1727096150.75694: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
13355 1727096150.75727: getting the remaining hosts for this loop
13355 1727096150.75728: done getting the remaining hosts for this loop
13355 1727096150.75732: getting the next task for host managed_node3
13355 1727096150.75736: done getting next task for host managed_node3
13355 1727096150.75738: ^ task is: TASK: Gathering Facts
13355 1727096150.75739: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096150.75742: getting variables
13355 1727096150.75743: in VariableManager get_vars()
13355 1727096150.75757: Calling all_inventory to load vars for managed_node3
13355 1727096150.75759: Calling groups_inventory to load vars for managed_node3
13355 1727096150.75762: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096150.75777: Calling all_plugins_play to load vars for managed_node3
13355 1727096150.75788: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096150.75792: Calling groups_plugins_play to load vars for managed_node3
13355 1727096150.75825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096150.75884: done with get_vars()
13355 1727096150.75891: done getting variables
13355 1727096150.75960: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
Monday 23 September 2024 08:55:50 -0400 (0:00:00.020) 0:00:00.020 ******
13355 1727096150.75986: entering _queue_task() for managed_node3/gather_facts
13355 1727096150.75987: Creating lock for gather_facts
13355 1727096150.76573: worker is 1 (out of 1 available)
13355 1727096150.76580: exiting _queue_task() for managed_node3/gather_facts
13355 1727096150.76592: done queuing things up, now waiting for results queue to drain
13355 1727096150.76593: waiting for pending results...
13355 1727096150.76727: running TaskExecutor() for managed_node3/TASK: Gathering Facts
13355 1727096150.76738: in run() - task 0afff68d-5257-c514-593f-0000000001bc
13355 1727096150.76763: variable 'ansible_search_path' from source: unknown
13355 1727096150.76804: calling self._execute()
13355 1727096150.76880: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096150.76891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096150.76903: variable 'omit' from source: magic vars
13355 1727096150.77011: variable 'omit' from source: magic vars
13355 1727096150.77047: variable 'omit' from source: magic vars
13355 1727096150.77093: variable 'omit' from source: magic vars
13355 1727096150.77139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13355 1727096150.77192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13355 1727096150.77220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13355 1727096150.77246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13355 1727096150.77371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13355 1727096150.77375: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13355 1727096150.77378: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096150.77381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096150.77436: Set connection var ansible_shell_executable to /bin/sh
13355 1727096150.77446: Set connection var ansible_shell_type to sh
13355 1727096150.77458: Set connection var ansible_pipelining to False
13355 1727096150.77467: Set connection var ansible_connection to ssh
13355 1727096150.77480: Set connection var ansible_module_compression to ZIP_DEFLATED
13355 1727096150.77488: Set connection var ansible_timeout to 10
13355 1727096150.77517: variable 'ansible_shell_executable' from source: unknown
13355 1727096150.77583: variable 'ansible_connection' from source: unknown
13355 1727096150.77587: variable 'ansible_module_compression' from source: unknown
13355 1727096150.77589: variable 'ansible_shell_type' from source: unknown
13355 1727096150.77591: variable 'ansible_shell_executable' from source: unknown
13355 1727096150.77594: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096150.77596: variable 'ansible_pipelining' from source: unknown
13355 1727096150.77598: variable 'ansible_timeout' from source: unknown
13355 1727096150.77600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096150.77751: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13355 1727096150.77773: variable 'omit' from source: magic vars
13355 1727096150.77783: starting attempt loop
13355 1727096150.77789: running the handler
13355 1727096150.77813: variable 'ansible_facts' from source: unknown
13355 1727096150.77837: _low_level_execute_command(): starting
13355 1727096150.77851: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13355 1727096150.78693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096150.78722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
13355 1727096150.78740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096150.78900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096150.78970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13355 1727096150.80780: stdout chunk (state=3): >>>/root <<<
13355 1727096150.80856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096150.80873: stdout chunk (state=3): >>><<<
13355 1727096150.80888: stderr chunk (state=3): >>><<<
13355 1727096150.80998: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13355 1727096150.81103: _low_level_execute_command(): starting
13355 1727096150.81106: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739 `" && echo ansible-tmp-1727096150.8100708-13415-125038212720739="` echo /root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739 `" ) && sleep 0'
13355 1727096150.82291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096150.82302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
13355 1727096150.82311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<<
13355 1727096150.82323: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
13355 1727096150.82331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13355 1727096150.82777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096150.82781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096150.82783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096150.82785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13355 1727096150.84738: stdout chunk (state=3): >>>ansible-tmp-1727096150.8100708-13415-125038212720739=/root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739 <<<
13355 1727096150.84875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096150.84880: stdout chunk (state=3): >>><<<
13355 1727096150.84886: stderr chunk (state=3): >>><<<
13355 1727096150.84907: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096150.8100708-13415-125038212720739=/root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096150.84950: variable 'ansible_module_compression' from source: unknown 13355 1727096150.85006: ANSIBALLZ: Using generic lock for ansible.legacy.setup 13355 1727096150.85010: ANSIBALLZ: Acquiring lock 13355 1727096150.85013: ANSIBALLZ: Lock acquired: 140397099650992 13355 1727096150.85015: ANSIBALLZ: Creating module 13355 1727096151.30763: ANSIBALLZ: Writing module into payload 13355 1727096151.30925: ANSIBALLZ: Writing module 13355 1727096151.30952: ANSIBALLZ: Renaming module 13355 1727096151.30970: ANSIBALLZ: Done creating module 13355 1727096151.31017: variable 'ansible_facts' from source: unknown 13355 1727096151.31070: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096151.31073: _low_level_execute_command(): starting 13355 1727096151.31076: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 13355 1727096151.31724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096151.31746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096151.31784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096151.31797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096151.31863: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096151.31903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096151.31928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096151.31959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096151.32177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096151.33788: stdout chunk (state=3): >>>PLATFORM <<< 13355 1727096151.33858: stdout chunk (state=3): >>>Linux <<< 13355 1727096151.33886: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 13355 1727096151.34079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096151.34082: stdout chunk (state=3): >>><<< 13355 1727096151.34084: stderr chunk (state=3): >>><<< 13355 1727096151.34107: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096151.34123 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 13355 1727096151.34177: _low_level_execute_command(): starting 13355 1727096151.34188: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 13355 1727096151.34501: Sending initial data 13355 1727096151.34504: Sent initial data (1181 bytes) 13355 1727096151.34841: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096151.34855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096151.34881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096151.34900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096151.34917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 
1727096151.34952: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096151.35055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096151.35116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096151.35179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096151.38792: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 13355 1727096151.39287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096151.39300: stdout chunk (state=3): >>><<< 13355 1727096151.39320: stderr chunk (state=3): >>><<< 13355 1727096151.39340: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": 
"NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096151.39518: variable 'ansible_facts' from source: unknown 13355 1727096151.39522: variable 'ansible_facts' from source: unknown 13355 1727096151.39525: variable 'ansible_module_compression' from source: unknown 13355 1727096151.39681: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13355 1727096151.39717: variable 'ansible_facts' from source: unknown 13355 1727096151.40374: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/AnsiballZ_setup.py 13355 1727096151.40936: Sending initial data 13355 1727096151.40940: Sent initial data (154 bytes) 13355 1727096151.42189: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096151.42423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096151.42448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096151.42878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096151.44537: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096151.44595: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096151.44659: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpubz9ijj2 /root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/AnsiballZ_setup.py <<< 13355 1727096151.44718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpubz9ijj2" to remote "/root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/AnsiballZ_setup.py" <<< 13355 1727096151.47534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096151.47720: stdout chunk (state=3): >>><<< 13355 1727096151.47724: stderr chunk (state=3): >>><<< 13355 1727096151.47726: done transferring module to remote 13355 1727096151.47727: _low_level_execute_command(): starting 13355 1727096151.47729: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/ 
/root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/AnsiballZ_setup.py && sleep 0' 13355 1727096151.48407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096151.48448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096151.48454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096151.48538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096151.48579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096151.48604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096151.48619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096151.48762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096151.50636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096151.50655: stderr chunk (state=3): >>><<< 13355 1727096151.50877: stdout chunk (state=3): >>><<< 13355 1727096151.50882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096151.50884: _low_level_execute_command(): starting 13355 1727096151.50886: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/AnsiballZ_setup.py && sleep 0' 13355 1727096151.52129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096151.52325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096151.52461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096151.52973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096151.55109: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 13355 1727096151.55143: stdout chunk (state=3): >>>import '_io' # <<< 13355 1727096151.55150: stdout chunk (state=3): >>>import 'marshal' # <<< 13355 1727096151.55210: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 13355 1727096151.55375: stdout chunk (state=3): >>>import 'time' # <<< 13355 1727096151.55392: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 13355 1727096151.55440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95777b84d0> 
import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577787b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 13355 1727096151.55448: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95777baa50> <<< 13355 1727096151.55465: stdout chunk (state=3): >>>import '_signal' # <<< 13355 1727096151.55488: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 13355 1727096151.55535: stdout chunk (state=3): >>>import 'io' # <<< 13355 1727096151.55690: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # <<< 13355 1727096151.55735: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 13355 1727096151.55763: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 13355 1727096151.55770: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 13355 1727096151.56176: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577569130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9577569fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 13355 1727096151.56392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13355 1727096151.56412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 13355 1727096151.56480: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a7dd0> <<< 13355 1727096151.56483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 13355 1727096151.56516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # <<< 13355 1727096151.56530: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a7fe0> <<< 13355 1727096151.56547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 13355 1727096151.56720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 13355 1727096151.56730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775df800> <<< 13355 1727096151.56737: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 13355 1727096151.56739: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775dfe90> <<< 13355 1727096151.56755: stdout chunk (state=3): >>>import '_collections' # <<< 13355 1727096151.57023: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775bfaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775bd1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a4f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 13355 1727096151.57031: stdout chunk (state=3): >>>import '_sre' # <<< 13355 1727096151.57034: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 13355 1727096151.57037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 13355 1727096151.57094: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 13355 1727096151.57099: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775ff6e0> <<< 13355 1727096151.57104: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775fe300> <<< 13355 1727096151.57307: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775be060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a6e70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95776347a0> <<< 13355 1727096151.57310: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a4200> <<< 13355 1727096151.57316: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577634c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577634b00> # 
extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577634ef0> <<< 13355 1727096151.57322: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a2d20> <<< 13355 1727096151.57558: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95776355b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577635280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95776364b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 13355 1727096151.57576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 13355 1727096151.57659: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f957764c680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f957764dd30> <<< 13355 1727096151.57662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 13355 1727096151.57695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 13355 1727096151.57699: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 13355 1727096151.57737: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957764ebd0> <<< 13355 1727096151.57765: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f957764f230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957764e120> <<< 13355 1727096151.58093: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import 
'_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f957764fcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957764f3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577636450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577347b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577370650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773703b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577370680> <<< 13355 1727096151.58111: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py <<< 13355 1727096151.58216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096151.58320: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577370fb0> <<< 13355 1727096151.58484: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577371910> <<< 13355 1727096151.58487: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577370860> <<< 13355 1727096151.58492: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577345d60> <<< 13355 1727096151.58584: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577372cc0> <<< 13355 1727096151.58735: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773717f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577636ba0> # 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 13355 1727096151.58858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957739f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 13355 1727096151.58861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096151.58893: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 13355 1727096151.58896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 13355 1727096151.59072: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773c3410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 13355 1727096151.59187: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95774201a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13355 1727096151.59197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13355 1727096151.59397: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577422900> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95774202c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773ed1c0> <<< 13355 1727096151.59503: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d2d2e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773c2210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577373bf0> <<< 13355 1727096151.59623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 13355 1727096151.59997: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f95773c2570> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_a73o6oq5/ansible_ansible.legacy.setup_payload.zip' <<< 13355 1727096151.60001: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.60038: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.60062: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 13355 1727096151.60078: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13355 1727096151.60204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13355 1727096151.60228: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d8f050> <<< 13355 1727096151.60245: stdout chunk (state=3): >>>import '_typing' # <<< 13355 1727096151.60428: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d6df40> <<< 13355 1727096151.60483: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d6d0a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 13355 1727096151.60501: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.60537: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 13355 1727096151.62053: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.63249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d8cf20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096151.63364: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 13355 1727096151.63504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576dbe9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576dbe750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576dbe060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576dbe930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577370440> import 'atexit' # <<< 13355 1727096151.63520: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576dbf6b0> <<< 13355 1727096151.63592: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096151.63618: stdout chunk (state=3): >>># extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576dbf830> <<< 13355 1727096151.63621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13355 1727096151.63804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576dbfd70> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c29b20> <<< 13355 1727096151.63808: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c2b710> <<< 13355 1727096151.63829: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 13355 1727096151.63884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 13355 1727096151.63898: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2c0e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 13355 1727096151.63930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 13355 1727096151.63959: stdout chunk (state=3): >>>import 'shlex' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2d250> <<< 13355 1727096151.63962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 13355 1727096151.64008: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 13355 1727096151.64117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2fd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95773efdd0> <<< 13355 1727096151.64149: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2e000> <<< 13355 1727096151.64155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13355 1727096151.64261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13355 1727096151.64562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches 
/usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c37b90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c36660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c363c0> <<< 13355 1727096151.64565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13355 1727096151.64583: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c36930> <<< 13355 1727096151.64605: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2e4e0> <<< 13355 1727096151.64631: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096151.64636: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c7bd70> <<< 13355 1727096151.64726: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c7bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 13355 1727096151.64729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 13355 1727096151.64790: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c7d9d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c7d7c0> <<< 13355 1727096151.64870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 13355 1727096151.65042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c7ff20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c7e090> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c83740> <<< 13355 
1727096151.65260: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c80110> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c84500> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096151.65424: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c84950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c84a10> <<< 13355 1727096151.65427: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c7c110> <<< 13355 1727096151.65431: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576b10170> <<< 13355 1727096151.65975: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576b115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c86900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c87cb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c86540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 13355 1727096151.66030: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.66387: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.66722: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.67335: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 13355 1727096151.67339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096151.67482: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576b157c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 13355 1727096151.67499: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b165d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b117f0> <<< 13355 1727096151.67587: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.67608: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 13355 1727096151.67786: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.67941: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 13355 1727096151.67944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 13355 1727096151.67946: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b16360> <<< 13355 
1727096151.67949: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.68419: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.69040: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.69044: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 13355 1727096151.69122: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 13355 1727096151.69176: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.69287: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 13355 1727096151.69301: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.69340: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.69379: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 13355 1727096151.69443: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.69617: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.69851: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13355 1727096151.69989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 13355 1727096151.70010: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b17830> <<< 13355 1727096151.70013: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.70091: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.70159: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 13355 
1727096151.70201: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 13355 1727096151.70302: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.70318: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 13355 1727096151.70329: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.70373: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.70427: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.70530: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13355 1727096151.70575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096151.70642: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576b22030> <<< 13355 1727096151.70675: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b1f3b0> <<< 13355 1727096151.70717: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 13355 1727096151.70720: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.70857: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.70892: stdout chunk (state=3): >>># zipimport: 
zlib available # zipimport: zlib available <<< 13355 1727096151.70955: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 13355 1727096151.70958: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096151.70972: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13355 1727096151.71145: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13355 1727096151.71148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c0aae0> <<< 13355 1727096151.71188: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576cfe7b0> <<< 13355 1727096151.71291: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b22210> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b21df0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 13355 1727096151.71440: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 13355 1727096151.71443: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13355 1727096151.71446: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.71448: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 13355 1727096151.71450: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.71727: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.71770: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 13355 1727096151.71774: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.71944: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.72056: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 13355 1727096151.72273: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.72337: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.72381: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.72602: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 
'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb6510> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 13355 1727096151.72645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 13355 1727096151.72825: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576774170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95767744a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576ba6c30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb6ff0> <<< 13355 1727096151.72867: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb4bf0> <<< 13355 1727096151.72872: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb4860> <<< 13355 1727096151.72875: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 13355 1727096151.73295: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576777440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576776cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576776ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576776120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95767775c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 13355 1727096151.73309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 13355 1727096151.73374: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95767d60c0> <<< 13355 1727096151.73382: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95767d40e0> <<< 13355 1727096151.73412: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb48c0> <<< 13355 1727096151.73419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 13355 1727096151.73461: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.73465: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.73469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 13355 1727096151.73472: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.73531: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.73639: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 13355 1727096151.73650: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.73741: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 13355 1727096151.73765: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.73793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 13355 1727096151.73835: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.73853: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13355 1727096151.73959: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 13355 1727096151.73962: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.74004: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 13355 1727096151.74007: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.74163: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.74177: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.74248: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 13355 1727096151.74256: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 13355 1727096151.74262: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.74818: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.75196: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 13355 1727096151.75260: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.75305: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.75378: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.75381: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 13355 1727096151.75676: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 13355 1727096151.75701: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # 
zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.75788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 13355 1727096151.75801: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.75892: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 13355 1727096151.75908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 13355 1727096151.76013: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95767d7980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 13355 1727096151.76092: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95767d6d80> <<< 13355 1727096151.76118: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 13355 1727096151.76254: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.76258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 13355 1727096151.76260: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.76338: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.76446: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 13355 1727096151.76497: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.76574: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 13355 1727096151.76586: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.76619: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13355 1727096151.76674: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 13355 1727096151.76980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576812390> <<< 13355 1727096151.77072: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95768021b0> <<< 13355 1727096151.77077: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 13355 1727096151.77080: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.77204: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 13355 1727096151.77426: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.77476: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.77620: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 13355 1727096151.77642: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.77765: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 13355 1727096151.77771: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.77981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576825e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576825a90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.78008: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 13355 1727096151.78012: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.78184: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.78323: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 13355 1727096151.78327: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.78520: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.78571: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.78616: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 13355 1727096151.78620: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 13355 1727096151.78645: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.78884: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.78983: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 13355 1727096151.79078: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.79197: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 13355 1727096151.79240: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.79326: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.79866: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.80404: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 13355 1727096151.80486: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.80515: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.80630: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 13355 1727096151.80728: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.80829: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 13355 1727096151.80843: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.81062: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.81293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 13355 1727096151.81296: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 13355 1727096151.81299: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 13355 1727096151.81499: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.81532: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.81718: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.82073: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 13355 1727096151.82128: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.82331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.82381: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 13355 1727096151.82477: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.82496: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 13355 1727096151.82778: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.83042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 13355 1727096151.83177: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.83181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 13355 1727096151.83183: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.83208: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.83289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 13355 1727096151.83296: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.83371: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: 
zlib available # zipimport: zlib available <<< 13355 1727096151.83390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 13355 1727096151.83404: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.84000: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 13355 1727096151.84004: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.84006: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.84009: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 13355 1727096151.84045: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.84094: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 13355 1727096151.84112: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.84496: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 13355 1727096151.84787: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096151.84862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 13355 1727096151.84879: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 13355 1727096151.85005: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.85112: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 13355 1727096151.85145: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096151.85346: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 13355 1727096151.85555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95765be4b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95765bcc80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95765b4c80> <<< 13355 1727096151.98154: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 13355 1727096151.98160: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 13355 1727096151.98179: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95765bfad0> <<< 13355 
1727096151.98203: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 13355 1727096151.98216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 13355 1727096151.98383: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576604920> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576605cd0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576605730> <<< 13355 1727096151.98747: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 13355 1727096152.23612: stdout chunk (state=3): >>> <<< 13355 1727096152.23648: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2988, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 543, "free": 2988}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", 
"uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 295, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261806120960, "block_size": 4096, "block_total": 65519099, "block_available": 63917510, "block_used": 1601589, "inode_total": 131070960, "inode_available": 131029185, "inode_used": 41775, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa<<< 13355 1727096152.23681: stdout chunk (state=3): >>>", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.44189453125, "5m": 0.48193359375, "15m": 0.23828125}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", 
"tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixe<<< 13355 1727096152.23764: stdout chunk (state=3): >>>d]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off 
[fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", 
"hour": "08", "minute": "55", "second": "52", "epoch": "1727096152", "epoch_int": "1727096152", "date": "2024-09-23", "time": "08:55:52", "iso8601_micro": "2024-09-23T12:55:52.231226Z", "iso8601": "2024-09-23T12:55:52Z", "iso8601_basic": "20240923T085552231226", "iso8601_basic_short": "20240923T085552", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "gather_subset": ["all"], "module_setup": true}, "invocation": 
{"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13355 1727096152.24313: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 13355 1727096152.24425: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] 
removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 13355 1727096152.24528: stdout chunk (state=3): >>># cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy 
pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # 
cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user <<< 13355 1727096152.24636: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing 
ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 13355 1727096152.25004: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13355 1727096152.25007: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 13355 1727096152.25009: stdout chunk (state=3): >>># destroy 
_bz2 # destroy _compression # destroy _lzma <<< 13355 1727096152.25011: stdout chunk (state=3): >>># destroy _blake2 <<< 13355 1727096152.25013: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 13355 1727096152.25014: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 13355 1727096152.25098: stdout chunk (state=3): >>># destroy ntpath # destroy importlib <<< 13355 1727096152.25110: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select <<< 13355 1727096152.25190: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 13355 1727096152.25328: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 13355 1727096152.25340: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 13355 1727096152.25343: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 13355 1727096152.25612: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # 
destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 13355 1727096152.25618: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 13355 1727096152.25731: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # 
cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 13355 1727096152.25735: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13355 1727096152.25839: stdout chunk (state=3): >>># destroy sys.monitoring <<< 13355 1727096152.26017: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 13355 1727096152.26021: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 13355 1727096152.26024: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # 
destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 13355 1727096152.26026: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 13355 1727096152.26129: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 13355 1727096152.26198: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 13355 1727096152.26297: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 13355 1727096152.26700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096152.26703: stdout chunk (state=3): >>><<< 13355 1727096152.26705: stderr chunk (state=3): >>><<< 13355 1727096152.27111: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95777b84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577787b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95777baa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577569130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577569fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a7dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a7fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775df800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775dfe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775bfaa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775bd1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a4f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775ff6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775fe300> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775be060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a6e70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95776347a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a4200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577634c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577634b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577634ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95775a2d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95776355b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577635280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95776364b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957764c680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f957764dd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957764ebd0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f957764f230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957764e120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f957764fcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957764f3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577636450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577347b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577370650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773703b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577370680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577370fb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9577371910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577370860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577345d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577372cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773717f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577636ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f957739f020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773c3410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95774201a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577422900> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95774202c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773ed1c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d2d2e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95773c2210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577373bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f95773c2570> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_a73o6oq5/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9576d8f050> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d6df40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d6d0a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576d8cf20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576dbe9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576dbe750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576dbe060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576dbe930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9577370440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576dbf6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576dbf830> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576dbfd70> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c29b20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c2b710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2c0e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2d250> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2fd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95773efdd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2e000> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c37b90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c36660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c363c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c36930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c2e4e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c7bd70> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c7bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c7d9d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c7d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c7ff20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c7e090> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c83740> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c80110> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c84500> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c84950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c84a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c7c110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576b10170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576b115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c86900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576c87cb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c86540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576b157c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b165d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b117f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b16360> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b17830> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576b22030> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b1f3b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576c0aae0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576cfe7b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b22210> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576b21df0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb6510> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576774170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95767744a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576ba6c30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb6ff0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb4bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb4860> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576777440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576776cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576776ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576776120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95767775c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95767d60c0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95767d40e0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576bb48c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95767d7980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95767d6d80> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576812390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95768021b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9576825e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576825a90> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f95765be4b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95765bcc80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95765b4c80> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f95765bfad0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576604920> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576605cd0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9576605730> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2988, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 543, "free": 2988}, "nocache": {"free": 3303, "used": 228}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, 
"labels": {}, "masters": {}}, "ansible_uptime_seconds": 295, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261806120960, "block_size": 4096, "block_total": 65519099, "block_available": 63917510, "block_used": 1601589, "inode_total": 131070960, "inode_available": 131029185, "inode_used": 41775, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": 
"enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.44189453125, "5m": 0.48193359375, "15m": 0.23828125}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": 
"off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "52", "epoch": "1727096152", "epoch_int": "1727096152", "date": "2024-09-23", "time": "08:55:52", "iso8601_micro": "2024-09-23T12:55:52.231226Z", "iso8601": "2024-09-23T12:55:52Z", "iso8601_basic": "20240923T085552231226", "iso8601_basic_short": "20240923T085552", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", 
"SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] 
removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] 
removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing 
linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing 
__mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] 
removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # 
destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # 
destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # 
destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy 
ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.152 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing 
re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # 
cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
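The interpreter-discovery warning above can be avoided by pinning the interpreter per host. A minimal inventory sketch, assuming the host name and interpreter path reported in this log (the actual /tmp/network-EuO/inventory.yml is not shown here):

```yaml
# Hypothetical inventory fragment: pin the interpreter that discovery found,
# so a future Python install on the host cannot change what the path means.
all:
  hosts:
    managed_node3:
      ansible_python_interpreter: /usr/bin/python3.12
```

Alternatively, `interpreter_python` can be set under `[defaults]` in ansible.cfg to apply one policy to all hosts.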
13355 1727096152.29449: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096152.29455: _low_level_execute_command(): starting 13355 1727096152.29458: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096150.8100708-13415-125038212720739/ > /dev/null 2>&1 && sleep 0' 13355 1727096152.29912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096152.30180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096152.30194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096152.30213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096152.30277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096152.32860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096152.32975: stderr chunk (state=3): >>><<< 13355 1727096152.32986: stdout chunk (state=3): >>><<< 13355 1727096152.33017: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096152.33039: handler run complete 13355 1727096152.33197: variable 'ansible_facts' from source: 
unknown 13355 1727096152.33401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096152.35178: variable 'ansible_facts' from source: unknown 13355 1727096152.35271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096152.35407: attempt loop complete, returning result 13355 1727096152.35417: _execute() done 13355 1727096152.35425: dumping result to json 13355 1727096152.35462: done dumping result, returning 13355 1727096152.35477: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0afff68d-5257-c514-593f-0000000001bc] 13355 1727096152.35487: sending task result for task 0afff68d-5257-c514-593f-0000000001bc 13355 1727096152.36072: done sending task result for task 0afff68d-5257-c514-593f-0000000001bc 13355 1727096152.36081: WORKER PROCESS EXITING ok: [managed_node3] 13355 1727096152.36184: no more pending results, returning what we have 13355 1727096152.36188: results queue empty 13355 1727096152.36188: checking for any_errors_fatal 13355 1727096152.36190: done checking for any_errors_fatal 13355 1727096152.36190: checking for max_fail_percentage 13355 1727096152.36192: done checking for max_fail_percentage 13355 1727096152.36193: checking to see if all hosts have failed and the running result is not ok 13355 1727096152.36193: done checking to see if all hosts have failed 13355 1727096152.36194: getting the remaining hosts for this loop 13355 1727096152.36196: done getting the remaining hosts for this loop 13355 1727096152.36200: getting the next task for host managed_node3 13355 1727096152.36206: done getting next task for host managed_node3 13355 1727096152.36207: ^ task is: TASK: meta (flush_handlers) 13355 1727096152.36209: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096152.36213: getting variables 13355 1727096152.36215: in VariableManager get_vars() 13355 1727096152.36239: Calling all_inventory to load vars for managed_node3 13355 1727096152.36242: Calling groups_inventory to load vars for managed_node3 13355 1727096152.36245: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096152.36255: Calling all_plugins_play to load vars for managed_node3 13355 1727096152.36258: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096152.36261: Calling groups_plugins_play to load vars for managed_node3 13355 1727096152.36458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096152.36669: done with get_vars() 13355 1727096152.36680: done getting variables 13355 1727096152.36752: in VariableManager get_vars() 13355 1727096152.36761: Calling all_inventory to load vars for managed_node3 13355 1727096152.36764: Calling groups_inventory to load vars for managed_node3 13355 1727096152.36766: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096152.36772: Calling all_plugins_play to load vars for managed_node3 13355 1727096152.36774: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096152.36777: Calling groups_plugins_play to load vars for managed_node3 13355 1727096152.36919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096152.37288: done with get_vars() 13355 1727096152.37301: done queuing things up, now waiting for results queue to drain 13355 1727096152.37303: results queue empty 13355 1727096152.37304: checking for any_errors_fatal 13355 1727096152.37306: done checking for any_errors_fatal 13355 1727096152.37307: checking for max_fail_percentage 13355 1727096152.37308: done checking for 
max_fail_percentage 13355 1727096152.37309: checking to see if all hosts have failed and the running result is not ok 13355 1727096152.37316: done checking to see if all hosts have failed 13355 1727096152.37317: getting the remaining hosts for this loop 13355 1727096152.37318: done getting the remaining hosts for this loop 13355 1727096152.37321: getting the next task for host managed_node3 13355 1727096152.37325: done getting next task for host managed_node3 13355 1727096152.37327: ^ task is: TASK: Include the task 'el_repo_setup.yml' 13355 1727096152.37329: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096152.37331: getting variables 13355 1727096152.37332: in VariableManager get_vars() 13355 1727096152.37339: Calling all_inventory to load vars for managed_node3 13355 1727096152.37341: Calling groups_inventory to load vars for managed_node3 13355 1727096152.37343: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096152.37348: Calling all_plugins_play to load vars for managed_node3 13355 1727096152.37350: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096152.37353: Calling groups_plugins_play to load vars for managed_node3 13355 1727096152.37499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096152.37673: done with get_vars() 13355 1727096152.37681: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:11 Monday 23 September 2024 08:55:52 -0400 (0:00:01.617) 0:00:01.638 ****** 13355 1727096152.37762: entering _queue_task() for 
managed_node3/include_tasks 13355 1727096152.37764: Creating lock for include_tasks 13355 1727096152.38127: worker is 1 (out of 1 available) 13355 1727096152.38140: exiting _queue_task() for managed_node3/include_tasks 13355 1727096152.38156: done queuing things up, now waiting for results queue to drain 13355 1727096152.38159: waiting for pending results... 13355 1727096152.38413: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 13355 1727096152.38542: in run() - task 0afff68d-5257-c514-593f-000000000006 13355 1727096152.38562: variable 'ansible_search_path' from source: unknown 13355 1727096152.38652: calling self._execute() 13355 1727096152.38737: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096152.38749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096152.38764: variable 'omit' from source: magic vars 13355 1727096152.38880: _execute() done 13355 1727096152.38901: dumping result to json 13355 1727096152.38909: done dumping result, returning 13355 1727096152.38922: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [0afff68d-5257-c514-593f-000000000006] 13355 1727096152.38950: sending task result for task 0afff68d-5257-c514-593f-000000000006 13355 1727096152.39240: no more pending results, returning what we have 13355 1727096152.39246: in VariableManager get_vars() 13355 1727096152.39289: Calling all_inventory to load vars for managed_node3 13355 1727096152.39293: Calling groups_inventory to load vars for managed_node3 13355 1727096152.39297: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096152.39311: Calling all_plugins_play to load vars for managed_node3 13355 1727096152.39314: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096152.39318: Calling groups_plugins_play to load vars for managed_node3 13355 1727096152.39815: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096152.40397: done with get_vars() 13355 1727096152.40405: variable 'ansible_search_path' from source: unknown 13355 1727096152.40562: done sending task result for task 0afff68d-5257-c514-593f-000000000006 13355 1727096152.40565: WORKER PROCESS EXITING 13355 1727096152.40577: we have included files to process 13355 1727096152.40578: generating all_blocks data 13355 1727096152.40580: done generating all_blocks data 13355 1727096152.40581: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13355 1727096152.40582: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13355 1727096152.40585: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13355 1727096152.41541: in VariableManager get_vars() 13355 1727096152.41554: done with get_vars() 13355 1727096152.41562: done processing included file 13355 1727096152.41563: iterating over new_blocks loaded from include file 13355 1727096152.41564: in VariableManager get_vars() 13355 1727096152.41572: done with get_vars() 13355 1727096152.41573: filtering new block on tags 13355 1727096152.41583: done filtering new block on tags 13355 1727096152.41585: in VariableManager get_vars() 13355 1727096152.41591: done with get_vars() 13355 1727096152.41592: filtering new block on tags 13355 1727096152.41602: done filtering new block on tags 13355 1727096152.41604: in VariableManager get_vars() 13355 1727096152.41611: done with get_vars() 13355 1727096152.41612: filtering new block on tags 13355 1727096152.41619: done filtering new block on tags 13355 1727096152.41620: done iterating over new_blocks loaded from include file included: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 13355 1727096152.41624: extending task lists for all hosts with included blocks 13355 1727096152.41657: done extending task lists 13355 1727096152.41658: done processing included files 13355 1727096152.41658: results queue empty 13355 1727096152.41659: checking for any_errors_fatal 13355 1727096152.41660: done checking for any_errors_fatal 13355 1727096152.41660: checking for max_fail_percentage 13355 1727096152.41661: done checking for max_fail_percentage 13355 1727096152.41662: checking to see if all hosts have failed and the running result is not ok 13355 1727096152.41662: done checking to see if all hosts have failed 13355 1727096152.41662: getting the remaining hosts for this loop 13355 1727096152.41663: done getting the remaining hosts for this loop 13355 1727096152.41665: getting the next task for host managed_node3 13355 1727096152.41669: done getting next task for host managed_node3 13355 1727096152.41670: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 13355 1727096152.41672: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096152.41673: getting variables 13355 1727096152.41674: in VariableManager get_vars() 13355 1727096152.41680: Calling all_inventory to load vars for managed_node3 13355 1727096152.41682: Calling groups_inventory to load vars for managed_node3 13355 1727096152.41683: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096152.41687: Calling all_plugins_play to load vars for managed_node3 13355 1727096152.41689: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096152.41690: Calling groups_plugins_play to load vars for managed_node3 13355 1727096152.41795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096152.41903: done with get_vars() 13355 1727096152.41910: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Monday 23 September 2024 08:55:52 -0400 (0:00:00.041) 0:00:01.680 ****** 13355 1727096152.41958: entering _queue_task() for managed_node3/setup 13355 1727096152.42207: worker is 1 (out of 1 available) 13355 1727096152.42220: exiting _queue_task() for managed_node3/setup 13355 1727096152.42233: done queuing things up, now waiting for results queue to drain 13355 1727096152.42234: waiting for pending results... 
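The task announced above runs the `setup` module conditionally. A hedged sketch of what such a task in el_repo_setup.yml might look like (the `gather_subset` value and the `network_test_required_facts` variable are assumptions; the conditional expression is copied verbatim from the "Evaluated conditional" line later in this log):

```yaml
# Hypothetical sketch of a minimal fact-gathering task; only the `when`
# expression is taken verbatim from this log, the rest is assumed.
- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset: min
  when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts
```

Gathering only a subset keeps the `ansible.legacy.setup` round trip (about 1.6 s per host in this run) as short as possible while still populating the facts the test needs.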
13355 1727096152.42384: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test
13355 1727096152.42448: in run() - task 0afff68d-5257-c514-593f-0000000001cd
13355 1727096152.42465: variable 'ansible_search_path' from source: unknown
13355 1727096152.42470: variable 'ansible_search_path' from source: unknown
13355 1727096152.42495: calling self._execute()
13355 1727096152.42555: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096152.42559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096152.42572: variable 'omit' from source: magic vars
13355 1727096152.42956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13355 1727096152.45498: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13355 1727096152.45548: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13355 1727096152.45585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13355 1727096152.45609: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13355 1727096152.45629: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13355 1727096152.45694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096152.45717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096152.45734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096152.45762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096152.45775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096152.45904: variable 'ansible_facts' from source: unknown
13355 1727096152.45951: variable 'network_test_required_facts' from source: task vars
13355 1727096152.45982: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True
13355 1727096152.45988: variable 'omit' from source: magic vars
13355 1727096152.46012: variable 'omit' from source: magic vars
13355 1727096152.46038: variable 'omit' from source: magic vars
13355 1727096152.46061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13355 1727096152.46084: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13355 1727096152.46098: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13355 1727096152.46110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13355 1727096152.46120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13355 1727096152.46144: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13355 1727096152.46149: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096152.46154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096152.46216: Set connection var ansible_shell_executable to /bin/sh
13355 1727096152.46220: Set connection var ansible_shell_type to sh
13355 1727096152.46226: Set connection var ansible_pipelining to False
13355 1727096152.46233: Set connection var ansible_connection to ssh
13355 1727096152.46236: Set connection var ansible_module_compression to ZIP_DEFLATED
13355 1727096152.46243: Set connection var ansible_timeout to 10
13355 1727096152.46265: variable 'ansible_shell_executable' from source: unknown
13355 1727096152.46269: variable 'ansible_connection' from source: unknown
13355 1727096152.46272: variable 'ansible_module_compression' from source: unknown
13355 1727096152.46275: variable 'ansible_shell_type' from source: unknown
13355 1727096152.46277: variable 'ansible_shell_executable' from source: unknown
13355 1727096152.46280: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096152.46282: variable 'ansible_pipelining' from source: unknown
13355 1727096152.46284: variable 'ansible_timeout' from source: unknown
13355 1727096152.46286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096152.46385: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
13355 1727096152.46394: variable 'omit' from source: magic vars
13355 1727096152.46399: starting attempt loop
13355 1727096152.46402: running the handler
13355 1727096152.46416: _low_level_execute_command(): starting
13355 1727096152.46422: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13355 1727096152.46940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13355 1727096152.46944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096152.46949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13355 1727096152.46951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096152.47038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
13355 1727096152.47042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096152.47044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096152.47097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
13355 1727096152.48992: stdout chunk (state=3): >>>/root <<<
13355 1727096152.49155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096152.49173: stdout chunk (state=3): >>><<<
13355 1727096152.49176: stderr chunk (state=3): >>><<<
13355 1727096152.49192: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
13355 1727096152.49204: _low_level_execute_command(): starting
13355 1727096152.49210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949 `" && echo ansible-tmp-1727096152.4919279-13475-46536980830949="` echo /root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949 `" ) && sleep 0'
13355 1727096152.49672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13355 1727096152.49676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096152.49678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13355 1727096152.49680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096152.49724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096152.49738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096152.49806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
13355 1727096152.52500: stdout chunk (state=3): >>>ansible-tmp-1727096152.4919279-13475-46536980830949=/root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949 <<<
13355 1727096152.52660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096152.52695: stderr chunk (state=3): >>><<<
13355 1727096152.52699: stdout chunk (state=3): >>><<<
13355 1727096152.52716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096152.4919279-13475-46536980830949=/root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
13355 1727096152.52760: variable 'ansible_module_compression' from source: unknown
13355 1727096152.52803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
13355 1727096152.52849: variable 'ansible_facts' from source: unknown
13355 1727096152.52986: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/AnsiballZ_setup.py
13355 1727096152.53122: Sending initial data
13355 1727096152.53125: Sent initial data (153 bytes)
13355 1727096152.53775: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096152.53780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096152.53824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096152.53843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096152.53874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
13355 1727096152.56134: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
13355 1727096152.56172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
13355 1727096152.56211: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp0ipbd41g /root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/AnsiballZ_setup.py <<<
13355 1727096152.56217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/AnsiballZ_setup.py" <<<
13355 1727096152.56254: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp0ipbd41g" to remote "/root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/AnsiballZ_setup.py" <<<
13355 1727096152.56259: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/AnsiballZ_setup.py" <<<
13355 1727096152.57297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096152.57342: stderr chunk (state=3): >>><<<
13355 1727096152.57345: stdout chunk (state=3): >>><<<
13355 1727096152.57366: done transferring module to remote
13355 1727096152.57383: _low_level_execute_command(): starting
13355 1727096152.57386: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/ /root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/AnsiballZ_setup.py && sleep 0'
13355 1727096152.57856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13355 1727096152.57863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<<
13355 1727096152.57866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
13355 1727096152.57870: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13355 1727096152.57872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096152.57925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
13355 1727096152.57929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096152.57933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096152.57970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
13355 1727096152.60587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096152.60593: stdout chunk (state=3): >>><<<
13355 1727096152.60606: stderr chunk (state=3): >>><<<
13355 1727096152.60626: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
13355 1727096152.60714: _low_level_execute_command(): starting
13355 1727096152.60718: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/AnsiballZ_setup.py && sleep 0'
13355 1727096152.61410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13355 1727096152.61548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096152.61603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
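The log above walks through ansible-core's standard module-execution sequence over ssh: create a restrictive per-task temp dir (`umask 77 && mkdir -p ...`), transfer the AnsiballZ payload via sftp, `chmod u+x` it, then run it with the discovered interpreter. The sketch below reproduces those four steps locally; it is an illustrative stand-in only (not ansible-core source), and the payload string and `run_payload` helper are invented for the example.

```python
import os
import stat
import subprocess
import sys
import tempfile


def run_payload(source: str) -> str:
    """Write `source` to a private temp dir, mark it executable, run it."""
    # Step 1: restrictive temp dir, like `umask 77 && mkdir -p ...` in the log.
    old_umask = os.umask(0o077)
    try:
        tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")
    finally:
        os.umask(old_umask)
    path = os.path.join(tmpdir, "AnsiballZ_demo.py")
    # Step 2: "transfer" the module (the log does this with an sftp `put`).
    with open(path, "w") as fh:
        fh.write(source)
    # Step 3: chmod u+x, matching the log's `chmod u+x <tmpdir> <module>`.
    os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)
    # Step 4: execute with the interpreter and collect stdout.
    result = subprocess.run(
        [sys.executable, path], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()


print(run_payload("print('module ran')"))  # prints "module ran"
```

The real connection plugin does steps 2–4 over the multiplexed ssh session visible in the stderr chunks, which is why each step appears as a separate `_low_level_execute_command()` round trip.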
13355 1727096152.61623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096152.61654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096152.61743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
13355 1727096152.64958: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
13355 1727096152.64987: stdout chunk (state=3): >>>import _imp # builtin <<<
13355 1727096152.65020: stdout chunk (state=3): >>>import '_thread' # <<<
13355 1727096152.65046: stdout chunk (state=3): >>>import '_warnings' # <<<
13355 1727096152.65063: stdout chunk (state=3): >>>import '_weakref' # <<<
13355 1727096152.65164: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<<
13355 1727096152.65211: stdout chunk (state=3): >>>import 'posix' # <<<
13355 1727096152.65263: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<<
13355 1727096152.65274: stdout chunk (state=3): >>># installing zipimport hook <<<
13355 1727096152.65305: stdout chunk (state=3): >>>import 'time' # <<<
13355 1727096152.65357: stdout chunk (state=3): >>>import 'zipimport' # <<<
13355 1727096152.65527: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<<
13355 1727096152.65557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<<
13355 1727096152.65574: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44b104d0> <<<
13355 1727096152.65593: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44adfb30> <<<
13355 1727096152.65630: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<<
13355 1727096152.65646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<<
13355 1727096152.65670: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44b12a50> <<<
13355 1727096152.65701: stdout chunk (state=3): >>>import '_signal' # <<<
13355 1727096152.65737: stdout chunk (state=3): >>>import '_abc' # <<<
13355 1727096152.65757: stdout chunk (state=3): >>>import 'abc' # <<<
13355 1727096152.65786: stdout chunk (state=3): >>>import 'io' # <<<
13355 1727096152.65843: stdout chunk (state=3): >>>import '_stat' # <<<
13355 1727096152.65851: stdout chunk (state=3): >>>import 'stat' # <<<
13355 1727096152.65988: stdout chunk (state=3): >>>import '_collections_abc' # <<<
13355 1727096152.66030: stdout chunk (state=3): >>>import 'genericpath' # <<<
13355 1727096152.66047: stdout chunk (state=3): >>>import 'posixpath' # <<<
13355 1727096152.66087: stdout chunk (state=3): >>>import 'os' # <<<
13355 1727096152.66232: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf448e5130> <<<
13355 1727096152.66285: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<<
13355 1727096152.66308: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<<
13355 1727096152.66332: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf448e5fa0> <<<
13355 1727096152.66377: stdout chunk (state=3): >>>import 'site' # <<<
13355 1727096152.66423: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
13355 1727096152.67074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<<
13355 1727096152.67098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<<
13355 1727096152.67136: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<<
13355 1727096152.67156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<<
13355 1727096152.67190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<<
13355 1727096152.67323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44923ec0> <<<
13355 1727096152.67347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<<
13355 1727096152.67374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<<
13355 1727096152.67407: stdout chunk (state=3): >>>import '_operator' # <<<
13355 1727096152.67424: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44923f80> <<<
13355 1727096152.67461: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<<
13355 1727096152.67502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<<
13355 1727096152.71163: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4495b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4495bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4493bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449392b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44921070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4497b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4497a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4493a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44978bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449b0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449b0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4491ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12<<<
13355 1727096152.71294: stdout chunk (state=3): >>>/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449c8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449c9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449cacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449ca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449cbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449cb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446bbc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extensio<<<
13355 1727096152.71302: stdout chunk (state=3): >>>n module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446e4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446e4740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446e5070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446e5a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e4920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446b9df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e6e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e5b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4470f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44733560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf447942c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code obje<<<
13355 1727096152.71328: stdout chunk (state=3): >>>ct from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44796a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf447943e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf447552b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445953d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44732360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e7d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader
object at 0x7fcf44595670> # zipimport: found 103 names in '/tmp/ansible_setup_payload_y5rcf4pi/ansible_setup_payload.zip' <<< 13355 1727096152.71345: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.71577: stdout chunk (state=3): >>># zipimport: zlib available<<< 13355 1727096152.71583: stdout chunk (state=3): >>> <<< 13355 1727096152.71624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 13355 1727096152.71628: stdout chunk (state=3): >>> <<< 13355 1727096152.71657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 13355 1727096152.71722: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 13355 1727096152.71726: stdout chunk (state=3): >>> <<< 13355 1727096152.71855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 13355 1727096152.71858: stdout chunk (state=3): >>> <<< 13355 1727096152.71894: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 13355 1727096152.71908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 13355 1727096152.71924: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445ff170><<< 13355 1727096152.71949: stdout chunk (state=3): >>> import '_typing' # <<< 13355 1727096152.72122: stdout chunk (state=3): >>> <<< 13355 1727096152.72245: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445de060> <<< 13355 1727096152.72270: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445dd1f0> <<< 13355 1727096152.72298: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.72342: stdout chunk (state=3): >>>import 'ansible' # <<< 13355 1727096152.72375: stdout chunk (state=3): >>># zipimport: zlib available<<< 13355 1727096152.72380: stdout chunk (state=3): >>> <<< 13355 1727096152.72410: stdout chunk (state=3): >>># zipimport: zlib available<<< 13355 1727096152.72415: stdout chunk (state=3): >>> <<< 13355 1727096152.72442: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.72476: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 13355 1727096152.72480: stdout chunk (state=3): >>> <<< 13355 1727096152.72509: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.74818: stdout chunk (state=3): >>># zipimport: zlib available<<< 13355 1727096152.75035: stdout chunk (state=3): >>> <<< 13355 1727096152.76355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 13355 1727096152.76360: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445fd040> <<< 13355 1727096152.76480: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' 
# extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf4462eb10> <<< 13355 1727096152.76525: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4462e8a0> <<< 13355 1727096152.76564: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4462e1b0> <<< 13355 1727096152.76592: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 13355 1727096152.76606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 13355 1727096152.76611: stdout chunk (state=3): >>> <<< 13355 1727096152.76660: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4462e600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445ffb90> <<< 13355 1727096152.76700: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf4462f890> <<< 13355 1727096152.76736: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.76766: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf4462fad0> <<< 13355 1727096152.76773: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13355 1727096152.76854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 13355 1727096152.76910: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4462ff50> <<< 13355 1727096152.76951: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 13355 1727096152.76987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 13355 1727096152.77030: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f29e20> <<< 13355 1727096152.77087: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f2ba40> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 13355 1727096152.77112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 13355 1727096152.77181: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2c410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 13355 1727096152.77221: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 13355 1727096152.77242: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2d5b0> <<< 13355 
1727096152.77302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 13355 1727096152.77343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 13355 1727096152.77348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 13355 1727096152.77425: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2ff50> <<< 13355 1727096152.77474: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f34680> <<< 13355 1727096152.77526: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2e2d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13355 1727096152.77560: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 13355 1727096152.77595: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 13355 1727096152.77658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13355 1727096152.77792: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 13355 1727096152.77827: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 13355 1727096152.77853: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f37fb0> import '_tokenize' # <<< 13355 1727096152.77947: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f36a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f367e0> <<< 13355 1727096152.77985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13355 1727096152.78109: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f36d50> <<< 13355 1727096152.78170: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2e7e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f7c1d0> <<< 13355 1727096152.78223: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f7c380> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 13355 1727096152.78279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 13355 1727096152.78324: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f7de20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f7dbe0> <<< 13355 1727096152.78392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 13355 1727096152.78462: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.78466: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f803b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f7e510> <<< 13355 1727096152.78498: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 13355 1727096152.78583: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 13355 1727096152.78588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 13355 1727096152.78608: stdout chunk (state=3): >>>import '_string' # <<< 13355 1727096152.78676: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f83b90> <<< 13355 1727096152.78875: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f80560> <<< 13355 1727096152.78978: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f84bc0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.78988: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f84d40> <<< 13355 1727096152.79037: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f84e30> <<< 13355 1727096152.79057: stdout chunk (state=3): 
>>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f7c4d0> <<< 13355 1727096152.79083: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 13355 1727096152.79091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 13355 1727096152.79119: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 13355 1727096152.79141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 13355 1727096152.79177: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.79211: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43e0c530> <<< 13355 1727096152.79431: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.79449: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43e0d490> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f86cc0> <<< 13355 1727096152.79504: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f87890> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f868d0> <<< 13355 1727096152.79509: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.79538: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 13355 1727096152.79541: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.79628: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.79721: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.79735: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 13355 1727096152.79766: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 13355 1727096152.79796: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.79917: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.80034: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.80905: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.81802: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 13355 1727096152.81812: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 13355 1727096152.81829: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 13355 1727096152.81862: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 13355 1727096152.81886: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096152.81954: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.81957: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43e15670> <<< 13355 1727096152.82080: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 13355 1727096152.82089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 13355 1727096152.82101: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e163c0> <<< 13355 1727096152.82115: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e0d7f0> <<< 13355 1727096152.82171: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 13355 1727096152.82194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.82214: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.82236: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 13355 1727096152.82252: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.82525: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.82735: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 13355 1727096152.82744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 13355 1727096152.82749: stdout chunk (state=3): >>>import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e16450> <<< 13355 1727096152.82768: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.83320: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.83782: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.83850: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.83939: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 13355 1727096152.83977: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.84020: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 13355 1727096152.84024: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.84086: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.84197: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 13355 1727096152.84212: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 13355 1727096152.84263: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.84308: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 13355 1727096152.84319: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.84537: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.84777: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13355 1727096152.84854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 13355 1727096152.84865: stdout chunk (state=3): >>>import '_ast' # <<< 13355 1727096152.84930: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcf43e17560> <<< 13355 1727096152.84955: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85006: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85087: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 13355 1727096152.85112: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 13355 1727096152.85123: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85181: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85211: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 13355 1727096152.85214: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85269: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85303: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85365: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85431: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13355 1727096152.85484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096152.85577: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43e220c0> <<< 13355 1727096152.85619: stdout chunk (state=3): >>>import 'selinux' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e1fd40> <<< 13355 1727096152.85642: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 13355 1727096152.85670: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85721: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85785: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85812: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.85866: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096152.85883: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 13355 1727096152.85908: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13355 1727096152.85929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13355 1727096152.86006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 13355 1727096152.86010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 13355 1727096152.86029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 13355 1727096152.86087: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f0aa50> <<< 13355 1727096152.86130: stdout chunk (state=3): >>>import 'argparse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcf4465a720> <<< 13355 1727096152.86222: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e22240> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e14ef0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 13355 1727096152.86236: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86262: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86295: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 13355 1727096152.86354: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 13355 1727096152.86383: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 13355 1727096152.86416: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86459: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86536: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86553: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86571: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86603: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86649: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86685: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86721: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 13355 1727096152.86752: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86808: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86882: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.86904: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13355 1727096152.86953: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 13355 1727096152.86956: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.87134: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.87314: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.87349: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.87405: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096152.87454: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 13355 1727096152.87458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 13355 1727096152.87474: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 13355 1727096152.87495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 13355 1727096152.87516: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb2270> <<< 13355 1727096152.87559: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 13355 1727096152.87577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 13355 1727096152.87626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 13355 1727096152.87665: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 13355 1727096152.87685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43af4170> <<< 13355 1727096152.87712: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.87730: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43af4740> <<< 13355 1727096152.87793: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e9b260> <<< 13355 1727096152.87835: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb2e10> <<< 13355 1727096152.87839: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb0950> <<< 13355 1727096152.87864: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb05f0> <<< 13355 1727096152.87870: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 13355 1727096152.87920: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 13355 1727096152.87955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 13355 1727096152.87961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 13355 1727096152.87985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 13355 1727096152.88033: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.88062: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43af7440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43af6cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43af6ed0> <<< 13355 1727096152.88084: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43af6120> <<< 13355 1727096152.88106: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 13355 1727096152.88249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 13355 1727096152.88268: stdout chunk (state=3): >>>import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcf43af75f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 13355 1727096152.88302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 13355 1727096152.88343: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43b420f0> <<< 13355 1727096152.88370: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b40110> <<< 13355 1727096152.88405: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb0650> import 'ansible.module_utils.facts.timeout' # <<< 13355 1727096152.88417: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 13355 1727096152.88456: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88459: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88483: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 13355 1727096152.88534: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88596: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 13355 1727096152.88611: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88660: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88718: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.other.ohai' # <<< 13355 1727096152.88740: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096152.88761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 13355 1727096152.88784: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88810: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88833: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 13355 1727096152.88880: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88940: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 13355 1727096152.88943: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.88981: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.89032: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 13355 1727096152.89048: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.89093: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.89158: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.89209: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.89273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 13355 1727096152.89294: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.89787: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90238: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 13355 1727096152.90241: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90287: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90346: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 13355 1727096152.90377: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90424: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 13355 1727096152.90440: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 13355 1727096152.90466: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90499: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 13355 1727096152.90562: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 13355 1727096152.90634: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90664: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 13355 1727096152.90735: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90773: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 13355 1727096152.90790: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90858: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.90949: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 13355 1727096152.90981: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b43e30> <<< 13355 1727096152.91009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 13355 1727096152.91041: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 13355 1727096152.91165: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b42de0> import 'ansible.module_utils.facts.system.local' # <<< 13355 1727096152.91184: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.91250: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.91309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 13355 1727096152.91329: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.91408: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.91509: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 13355 1727096152.91512: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.91580: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.91670: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 13355 1727096152.91673: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.91708: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.91757: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 13355 1727096152.91817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 13355 1727096152.91893: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.91958: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43b82420> <<< 13355 
1727096152.92157: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b72210> import 'ansible.module_utils.facts.system.python' # <<< 13355 1727096152.92178: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92231: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92295: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 13355 1727096152.92298: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92383: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92479: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92592: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92737: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 13355 1727096152.92762: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92792: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92843: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 13355 1727096152.92846: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92887: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.92943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 13355 1727096152.92946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 13355 1727096152.92990: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096152.93019: stdout chunk (state=3): >>># extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43b96180> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b72a80> <<< 13355 1727096152.93054: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 13355 1727096152.93065: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93103: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 13355 1727096152.93169: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93312: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 13355 1727096152.93477: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93578: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93679: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93725: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93774: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 13355 1727096152.93807: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93816: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93841: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.93975: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.94130: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # 
<<< 13355 1727096152.94133: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.94250: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.94387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 13355 1727096152.94390: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.94420: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.94455: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.95032: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.95583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 13355 1727096152.95886: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 13355 1727096152.96030: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.96177: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 13355 1727096152.96199: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.96463: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.96708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 13355 1727096152.96739: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.96763: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.96788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 13355 1727096152.96819: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.96888: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.96952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 13355 1727096152.97026: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.97137: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.97297: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.97648: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.97998: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 13355 1727096152.98013: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.98061: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.98097: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 13355 1727096152.98125: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.98142: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.98177: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 13355 1727096152.98206: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.98399: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.98504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096152.98572: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096152.98583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 13355 1727096152.98590: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.98679: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.98757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 13355 1727096152.98782: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.99049: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.99323: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 13355 1727096152.99403: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.99441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 13355 1727096152.99644: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 13355 1727096152.99683: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.99734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 13355 1727096152.99875: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096152.99968: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 13355 1727096153.00000: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096153.00006: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 13355 1727096153.00080: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13355 1727096153.00142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 13355 1727096153.00153: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.00182: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.00205: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.00279: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.00346: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.00551: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # <<< 13355 1727096153.00573: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 13355 1727096153.00597: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.00665: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.00735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 13355 1727096153.00745: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.01122: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.01410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 13355 1727096153.01448: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.01483: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.01568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 13355 1727096153.01623: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.01700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 13355 1727096153.01870: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.01966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 13355 1727096153.02092: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.02295: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 13355 1727096153.02343: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.02922: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 13355 1727096153.03003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 13355 1727096153.03027: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43992b40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43990260> <<< 13355 1727096153.03086: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4398bb90> <<< 13355 1727096153.04238: stdout chunk (state=3): >>> <<< 13355 1727096153.04280: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 
33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "53", "epoch": "1727096153", "epoch_int": "1727096153", "date": "2024-09-23", "time": "08:55:53", "iso8601_micro": "2024-09-23T12:55:53.032884Z", "iso8601": "2024-09-23T12:55:53Z", "iso8601<<< 13355 1727096153.04289: stdout chunk (state=3): >>>_basic": "20240923T085553032884", "iso8601_basic_short": "20240923T085553", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13355 1727096153.05369: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 13355 1727096153.05374: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport <<< 13355 1727096153.05440: stdout chunk (state=3): >>># cleanup[2] removing _codecs # cleanup[2] removing codecs <<< 13355 1727096153.05465: stdout chunk (state=3): >>># cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] 
removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap<<< 13355 1727096153.05471: stdout chunk (state=3): >>> # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect <<< 13355 1727096153.05474: stdout chunk (state=3): >>># cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 13355 1727096153.05633: stdout chunk (state=3): >>># cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] 
removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache<<< 13355 1727096153.05673: stdout chunk (state=3): >>> # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing 
systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # 
destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # 
cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi <<< 13355 1727096153.05694: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips <<< 13355 1727096153.05751: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy 
ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix <<< 13355 1727096153.05756: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix <<< 13355 1727096153.05758: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux<<< 13355 1727096153.05760: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos<<< 13355 1727096153.05952: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual <<< 13355 1727096153.05961: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # 
destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 13355 1727096153.06519: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13355 1727096153.06538: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 13355 1727096153.06571: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression <<< 13355 1727096153.06580: stdout chunk (state=3): >>># destroy _lzma <<< 13355 1727096153.06635: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 13355 1727096153.06675: stdout chunk (state=3): >>># destroy ntpath <<< 13355 1727096153.06709: stdout chunk (state=3): >>># destroy importlib # destroy zipimport <<< 13355 1727096153.06712: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 13355 1727096153.06944: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # 
destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 13355 1727096153.06964: stdout chunk (state=3): >>># destroy queue <<< 13355 1727096153.06974: stdout chunk (state=3): >>># destroy _heapq # destroy _queue<<< 13355 1727096153.06993: stdout chunk (state=3): >>> # destroy multiprocessing.process <<< 13355 1727096153.06998: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors <<< 13355 1727096153.07001: stdout chunk (state=3): >>># destroy _multiprocessing <<< 13355 1727096153.07039: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 13355 1727096153.07060: stdout chunk (state=3): >>># destroy datetime <<< 13355 1727096153.07069: stdout chunk (state=3): >>># destroy subprocess <<< 13355 1727096153.07073: stdout chunk (state=3): >>># destroy base64 <<< 13355 1727096153.07101: stdout chunk (state=3): >>># destroy _ssl <<< 13355 1727096153.07140: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 13355 1727096153.07147: stdout chunk (state=3): >>># destroy getpass # destroy pwd <<< 13355 1727096153.07164: stdout chunk (state=3): >>># destroy termios <<< 13355 1727096153.07172: stdout chunk (state=3): >>># destroy errno # destroy json <<< 13355 1727096153.07207: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 13355 1727096153.07240: stdout chunk (state=3): >>># destroy glob # destroy fnmatch<<< 13355 1727096153.07247: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 13355 1727096153.07314: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna <<< 13355 1727096153.07319: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping 
configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 13355 1727096153.07352: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 13355 1727096153.07355: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader <<< 13355 1727096153.07364: stdout chunk (state=3): >>># cleanup[3] wiping systemd._journal # cleanup[3] wiping _string <<< 13355 1727096153.07527: stdout chunk (state=3): >>># cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 13355 1727096153.07545: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 <<< 13355 1727096153.07554: stdout chunk (state=3): >>># destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13355 1727096153.07833: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 13355 1727096153.07870: stdout chunk (state=3): >>># destroy _collections <<< 13355 1727096153.07907: stdout chunk (state=3): >>># destroy platform <<< 13355 1727096153.07911: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 13355 1727096153.07924: stdout chunk (state=3): >>># destroy tokenize <<< 13355 1727096153.07957: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 13355 1727096153.07961: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 13355 1727096153.08017: stdout chunk (state=3): >>># destroy _typing <<< 13355 1727096153.08020: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error<<< 13355 1727096153.08025: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy 
operator <<< 13355 1727096153.08028: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves <<< 13355 1727096153.08056: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 13355 1727096153.08094: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules <<< 13355 1727096153.08100: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 13355 1727096153.08240: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 13355 1727096153.08245: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io<<< 13355 1727096153.08247: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections <<< 13355 1727096153.08270: stdout chunk (state=3): >>># destroy threading <<< 13355 1727096153.08427: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 13355 1727096153.08909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096153.08939: stderr chunk (state=3): >>><<< 13355 1727096153.08942: stdout chunk (state=3): >>><<< 13355 1727096153.09051: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44b104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44adfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44b12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf448e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf448e5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44923ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44923f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4495b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4495bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4493bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449392b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44921070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4497b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4497a3f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4493a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44978bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449b0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449b0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4491ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449c8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449c9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449cacc0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449ca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf449cbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449cb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446bbc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446e4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446e4740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446e5070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf446e5a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e4920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446b9df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e6e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e5b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf449b2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4470f1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44733560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf447942c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44796a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf447943e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf447552b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445953d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf44732360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf446e7d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcf44595670> # zipimport: found 103 names in '/tmp/ansible_setup_payload_y5rcf4pi/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcf445ff170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445de060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445dd1f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445fd040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf4462eb10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4462e8a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4462e1b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4462e600> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf445ffb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf4462f890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf4462fad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4462ff50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f29e20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f2ba40> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2c410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2d5b0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2ff50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f34680> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2e2d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f37fb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f36a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f367e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f36d50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f2e7e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f7c1d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f7c380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f7de20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f7dbe0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f803b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f7e510> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f83b90> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f80560> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f84bc0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f84d40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f84e30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f7c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43e0c530> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43e0d490> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f86cc0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43f87890> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f868d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43e15670> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e163c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e0d7f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e16450> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e17560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43e220c0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e1fd40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43f0aa50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4465a720> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e22240> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e14ef0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb2270> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43af4170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43af4740> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43e9b260> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb2e10> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb0950> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb05f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43af7440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43af6cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43af6ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43af6120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43af75f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43b420f0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b40110> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43eb0650> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b43e30> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b42de0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43b82420> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b72210> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43b96180> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43b72a80> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcf43992b40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf43990260> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcf4398bb90> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": 
"ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "53", "epoch": "1727096153", "epoch_int": "1727096153", "date": "2024-09-23", "time": "08:55:53", "iso8601_micro": 
"2024-09-23T12:55:53.032884Z", "iso8601": "2024-09-23T12:55:53Z", "iso8601_basic": "20240923T085553032884", "iso8601_basic_short": "20240923T085553", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] 
removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # 
cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string 
# destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing 
multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # 
cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] 
removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy 
ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy 
ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob 
# destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # 
destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
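The `debug1: auto-mux` lines in the stderr dump above record OpenSSH connection multiplexing: Ansible reuses a persistent master connection through a control socket under `/root/.ansible/cp/`, so each task avoids a fresh SSH handshake. As an illustrative sketch only (socket path and host are assumptions, not taken from Ansible's generated options), the equivalent `ssh_config` settings look like:

```
Host 10.31.14.152
    ControlMaster auto
    ControlPath ~/.ansible/cp/%C
    ControlPersist 60s
```

With such a config, `ssh -O check 10.31.14.152` reports whether a live master exists, which corresponds to the "Trying existing master" and "master session id" messages seen in the log.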
[WARNING]: Module invocation had junk after the JSON data 13355 1727096153.09877: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096153.09881: _low_level_execute_command(): starting 13355 1727096153.09883: _low_level_execute_command():
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096152.4919279-13475-46536980830949/ > /dev/null 2>&1 && sleep 0' 13355 1727096153.09886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096153.09888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096153.09890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096153.09892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096153.09895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096153.09905: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096153.09923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13355 1727096153.09926: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096153.09936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.09993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096153.10006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.10061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13355 1727096153.12790: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 13355 1727096153.12816: stderr chunk (state=3): >>><<< 13355 1727096153.12820: stdout chunk (state=3): >>><<< 13355 1727096153.12839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13355 1727096153.12844: handler run complete 13355 1727096153.12879: variable 'ansible_facts' from source: unknown 13355 1727096153.12920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.12996: variable 'ansible_facts' from source: unknown 13355 1727096153.13027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.13066: attempt loop complete, returning result 13355 1727096153.13071: _execute() done 13355 1727096153.13074: dumping result to json 13355 
1727096153.13081: done dumping result, returning 13355 1727096153.13090: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0afff68d-5257-c514-593f-0000000001cd] 13355 1727096153.13095: sending task result for task 0afff68d-5257-c514-593f-0000000001cd 13355 1727096153.13222: done sending task result for task 0afff68d-5257-c514-593f-0000000001cd 13355 1727096153.13225: WORKER PROCESS EXITING ok: [managed_node3] 13355 1727096153.13323: no more pending results, returning what we have 13355 1727096153.13326: results queue empty 13355 1727096153.13327: checking for any_errors_fatal 13355 1727096153.13329: done checking for any_errors_fatal 13355 1727096153.13329: checking for max_fail_percentage 13355 1727096153.13331: done checking for max_fail_percentage 13355 1727096153.13331: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.13332: done checking to see if all hosts have failed 13355 1727096153.13333: getting the remaining hosts for this loop 13355 1727096153.13342: done getting the remaining hosts for this loop 13355 1727096153.13347: getting the next task for host managed_node3 13355 1727096153.13354: done getting next task for host managed_node3 13355 1727096153.13357: ^ task is: TASK: Check if system is ostree 13355 1727096153.13360: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.13363: getting variables 13355 1727096153.13364: in VariableManager get_vars() 13355 1727096153.13392: Calling all_inventory to load vars for managed_node3 13355 1727096153.13394: Calling groups_inventory to load vars for managed_node3 13355 1727096153.13397: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.13407: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.13409: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.13411: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.13565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.13683: done with get_vars() 13355 1727096153.13691: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Monday 23 September 2024 08:55:53 -0400 (0:00:00.717) 0:00:02.398 ****** 13355 1727096153.13758: entering _queue_task() for managed_node3/stat 13355 1727096153.13984: worker is 1 (out of 1 available) 13355 1727096153.13997: exiting _queue_task() for managed_node3/stat 13355 1727096153.14009: done queuing things up, now waiting for results queue to drain 13355 1727096153.14011: waiting for pending results... 
13355 1727096153.14162: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 13355 1727096153.14226: in run() - task 0afff68d-5257-c514-593f-0000000001cf 13355 1727096153.14247: variable 'ansible_search_path' from source: unknown 13355 1727096153.14251: variable 'ansible_search_path' from source: unknown 13355 1727096153.14276: calling self._execute() 13355 1727096153.14334: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.14339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.14359: variable 'omit' from source: magic vars 13355 1727096153.14621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096153.14825: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096153.14859: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096153.14886: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096153.14915: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096153.14979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096153.14999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096153.15021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096153.15039: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096153.15135: Evaluated conditional (not __network_is_ostree is defined): True 13355 1727096153.15139: variable 'omit' from source: magic vars 13355 1727096153.15170: variable 'omit' from source: magic vars 13355 1727096153.15196: variable 'omit' from source: magic vars 13355 1727096153.15216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096153.15242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096153.15257: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096153.15292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096153.15301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096153.15324: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096153.15330: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.15333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.15401: Set connection var ansible_shell_executable to /bin/sh 13355 1727096153.15405: Set connection var ansible_shell_type to sh 13355 1727096153.15411: Set connection var ansible_pipelining to False 13355 1727096153.15415: Set connection var ansible_connection to ssh 13355 1727096153.15420: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096153.15425: Set connection var ansible_timeout to 10 13355 1727096153.15446: variable 'ansible_shell_executable' from source: unknown 13355 1727096153.15450: variable 'ansible_connection' from 
source: unknown 13355 1727096153.15455: variable 'ansible_module_compression' from source: unknown 13355 1727096153.15458: variable 'ansible_shell_type' from source: unknown 13355 1727096153.15461: variable 'ansible_shell_executable' from source: unknown 13355 1727096153.15463: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.15465: variable 'ansible_pipelining' from source: unknown 13355 1727096153.15469: variable 'ansible_timeout' from source: unknown 13355 1727096153.15471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.15568: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096153.15577: variable 'omit' from source: magic vars 13355 1727096153.15585: starting attempt loop 13355 1727096153.15589: running the handler 13355 1727096153.15600: _low_level_execute_command(): starting 13355 1727096153.15606: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096153.16132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096153.16138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.16141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13355 1727096153.16144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.16198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096153.16201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096153.16203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.16267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13355 1727096153.18684: stdout chunk (state=3): >>>/root <<< 13355 1727096153.18829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096153.18870: stderr chunk (state=3): >>><<< 13355 1727096153.18875: stdout chunk (state=3): >>><<< 13355 1727096153.18896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
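The `/bin/sh -c 'echo ~ && sleep 0'` command above (whose `stdout=/root` result is reported here) is how the connection plugin discovers the remote user's home directory before it builds any temp paths. A minimal local reproduction, assuming a POSIX `/bin/sh`:

```python
# Sketch of the home-directory probe seen in the log above.
# The 'sleep 0' trailer is kept only to mirror the logged command.
import subprocess

out = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True,
)
home = out.stdout.strip()
print(home)  # the shell expands '~' to the current user's home
```

The probe succeeds or fails cheaply over the already-open multiplexed SSH session, which is why it appears before every batch of remote file operations.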
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13355 1727096153.18912: _low_level_execute_command(): starting 13355 1727096153.18919: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732 `" && echo ansible-tmp-1727096153.1889527-13504-143674206333732="` echo /root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732 `" ) && sleep 0' 13355 1727096153.19387: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096153.19391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.19394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096153.19396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.19449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' <<< 13355 1727096153.19458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096153.19461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.19499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13355 1727096153.22328: stdout chunk (state=3): >>>ansible-tmp-1727096153.1889527-13504-143674206333732=/root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732 <<< 13355 1727096153.22483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096153.22513: stderr chunk (state=3): >>><<< 13355 1727096153.22516: stdout chunk (state=3): >>><<< 13355 1727096153.22537: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096153.1889527-13504-143674206333732=/root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
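The `( umask 77 && mkdir -p ... )` one-liner above guarantees the remote temp directory is created mode 0700 regardless of the login shell's default umask, with the subshell confining the umask change. A local sketch of the same effect in Python (directory names here are illustrative, not Ansible's real naming scheme):

```python
# Reproduce the permission effect of '( umask 77 && mkdir ... )' locally.
import os
import stat
import tempfile

old = os.umask(0o77)              # like 'umask 77' inside the subshell
try:
    base = tempfile.mkdtemp(prefix="ansible-demo-")
    tmp = os.path.join(base, "ansible-tmp-demo")
    os.mkdir(tmp)                 # 0o777 & ~0o77 -> mode 0700
finally:
    os.umask(old)                 # restore, as leaving the subshell does

print(oct(stat.S_IMODE(os.stat(tmp).st_mode)))  # 0o700
```

Wrapping the umask in a subshell means the restrictive mask never leaks into the rest of the remote shell session.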
version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13355 1727096153.22584: variable 'ansible_module_compression' from source: unknown 13355 1727096153.22631: ANSIBALLZ: Using lock for stat 13355 1727096153.22636: ANSIBALLZ: Acquiring lock 13355 1727096153.22639: ANSIBALLZ: Lock acquired: 140397099652192 13355 1727096153.22641: ANSIBALLZ: Creating module 13355 1727096153.31875: ANSIBALLZ: Writing module into payload 13355 1727096153.31880: ANSIBALLZ: Writing module 13355 1727096153.31888: ANSIBALLZ: Renaming module 13355 1727096153.31900: ANSIBALLZ: Done creating module 13355 1727096153.31921: variable 'ansible_facts' from source: unknown 13355 1727096153.32005: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/AnsiballZ_stat.py 13355 1727096153.32197: Sending initial data 13355 1727096153.32200: Sent initial data (153 bytes) 13355 1727096153.32881: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 
1727096153.32903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096153.32919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.32992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096153.35250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096153.35284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096153.35322: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpga_nwo8b /root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/AnsiballZ_stat.py <<< 13355 1727096153.35326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/AnsiballZ_stat.py" <<< 13355 1727096153.35359: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpga_nwo8b" to remote "/root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/AnsiballZ_stat.py" <<< 13355 1727096153.35362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/AnsiballZ_stat.py" <<< 13355 1727096153.36076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096153.36080: stderr chunk (state=3): >>><<< 13355 1727096153.36082: stdout chunk (state=3): >>><<< 13355 1727096153.36119: done transferring module to remote 13355 1727096153.36137: _low_level_execute_command(): starting 13355 1727096153.36146: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/ /root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/AnsiballZ_stat.py && sleep 0' 13355 1727096153.36721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096153.36735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 
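The `AnsiballZ_stat.py` file put over SFTP above is a self-contained wrapper: the real module code rides inside it as an embedded zip payload that the wrapper unpacks and runs on the target, so only one file ever needs to be transferred. A toy sketch of that idea (this is not Ansible's actual packer; `mod.py` and the wrapper layout are made up for illustration):

```python
# Toy AnsiballZ-style round trip: zip a module, embed it in a wrapper
# script, run the wrapper as a single file.
import base64
import io
import os
import subprocess
import sys
import tempfile
import zipfile

# 1. Bundle the "module" into an in-memory zip.
payload = io.BytesIO()
with zipfile.ZipFile(payload, "w") as zf:
    zf.writestr("mod.py", "print('hello from bundled module')\n")

# 2. Generate a wrapper that carries the zip as a base64 literal.
wrapper = (
    "import base64, io, zipfile, runpy, os, tempfile\n"
    f"data = base64.b64decode({base64.b64encode(payload.getvalue())!r})\n"
    "d = tempfile.mkdtemp()\n"
    "with zipfile.ZipFile(io.BytesIO(data)) as zf:\n"
    "    zf.extractall(d)\n"
    "runpy.run_path(os.path.join(d, 'mod.py'))\n"
)

# 3. Write the wrapper to one file and execute it, as the log does
#    with 'AnsiballZ_stat.py' after the SFTP put.
with tempfile.NamedTemporaryFile("w", suffix="_demo.py", delete=False) as f:
    f.write(wrapper)
out = subprocess.run([sys.executable, f.name], capture_output=True, text=True)
os.unlink(f.name)
print(out.stdout, end="")  # -> hello from bundled module
```

The single-file shape is what makes the one `sftp put` plus one `chmod u+x` sequence in the log sufficient to stage an arbitrary module.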
1727096153.36750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.36807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096153.36821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.36859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13355 1727096153.39388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096153.39537: stderr chunk (state=3): >>><<< 13355 1727096153.39541: stdout chunk (state=3): >>><<< 13355 1727096153.39543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13355 1727096153.39545: _low_level_execute_command(): starting 13355 1727096153.39547: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/AnsiballZ_stat.py && sleep 0' 13355 1727096153.40196: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096153.40227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096153.40350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096153.40419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.40475: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096153.42763: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 13355 1727096153.42784: stdout chunk (state=3): >>>import _imp # builtin <<< 13355 1727096153.42803: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 13355 1727096153.42832: stdout chunk (state=3): >>>import '_weakref' # <<< 13355 1727096153.42859: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 13355 1727096153.42905: stdout chunk (state=3): >>>import 'posix' # <<< 13355 1727096153.42950: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 13355 1727096153.42955: stdout chunk (state=3): >>>import 'time' # <<< 13355 1727096153.42958: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 13355 1727096153.43024: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096153.43028: stdout chunk (state=3): >>>import '_codecs' # <<< 13355 1727096153.43042: stdout chunk (state=3): >>>import 'codecs' # <<< 13355 1727096153.43079: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 13355 1727096153.43150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c2184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c1e7b30> <<< 13355 1727096153.43166: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c21aa50> <<< 13355 1727096153.43202: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 13355 1727096153.43232: stdout chunk (state=3): >>>import 'io' # <<< 13355 1727096153.43261: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 13355 1727096153.43354: stdout chunk (state=3): >>>import '_collections_abc' # <<< 13355 1727096153.43403: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # <<< 13355 1727096153.43409: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 13355 1727096153.43455: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 13355 1727096153.43461: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 13355 1727096153.43499: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c02d130> <<< 13355 1727096153.43570: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 13355 1727096153.43595: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c02dfa0> import 'site' # <<< 13355 1727096153.43631: stdout chunk (state=3): >>>Python 
3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 13355 1727096153.43860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 13355 1727096153.43878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 13355 1727096153.43895: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 13355 1727096153.43929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13355 1727096153.43957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 13355 1727096153.43983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 13355 1727096153.44010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 13355 1727096153.44013: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c06bec0> <<< 13355 1727096153.44049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 13355 1727096153.44078: stdout chunk (state=3): >>>import '_operator' # <<< 13355 1727096153.44099: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c06bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 13355 1727096153.44118: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 13355 1727096153.44144: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 13355 1727096153.44196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096153.44212: stdout chunk (state=3): >>>import 'itertools' # <<< 13355 1727096153.44278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 13355 1727096153.44283: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0a3ec0> <<< 13355 1727096153.44302: stdout chunk (state=3): >>>import '_collections' # <<< 13355 1727096153.44342: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c083b60> <<< 13355 1727096153.44361: stdout chunk (state=3): >>>import '_functools' # <<< 13355 1727096153.44381: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0812b0> <<< 13355 1727096153.44466: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c069070> <<< 13355 1727096153.44512: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 13355 1727096153.44516: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 13355 1727096153.44545: stdout chunk (state=3): >>>import '_sre' # <<< 13355 1727096153.44548: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 13355 1727096153.44577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 13355 1727096153.44598: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 13355 1727096153.44638: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0c37d0> <<< 13355 1727096153.44657: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0c23f0> <<< 13355 1727096153.44680: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c082150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0c0bc0> <<< 13355 1727096153.44735: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 13355 1727096153.44771: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code 
object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 13355 1727096153.44932: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c0f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c0f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c066e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 13355 1727096153.44988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 13355 1727096153.44997: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0f9370> import 'importlib.machinery' # <<< 13355 1727096153.45036: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0fa540> import 'importlib.util' # import 'runpy' # <<< 13355 1727096153.45322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c110740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c111e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c112cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c1132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c112210> <<< 13355 1727096153.45325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from 
'/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 13355 1727096153.45360: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096153.45384: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c113d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c1134a0> <<< 13355 1727096153.45423: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0fa4b0> <<< 13355 1727096153.45444: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 13355 1727096153.45481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 13355 1727096153.45628: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 13355 1727096153.45658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2bed3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2befc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2befc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2befc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 13355 1727096153.45677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 13355 1727096153.45744: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096153.45864: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2befd070> <<< 13355 1727096153.46014: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2befda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2befc920> <<< 13355 1727096153.46040: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bed1df0> <<< 13355 1727096153.46116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc 
matches /usr/lib64/python3.12/weakref.py <<< 13355 1727096153.46145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2befee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2befdb50> <<< 13355 1727096153.46176: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0fac60> <<< 13355 1727096153.46188: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 13355 1727096153.46250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096153.46264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 13355 1727096153.46294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 13355 1727096153.46320: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bf271a0> <<< 13355 1727096153.46374: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 13355 1727096153.46414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096153.46431: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 13355 1727096153.46434: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 13355 1727096153.46462: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bf4b560> <<< 13355 1727096153.46483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 13355 1727096153.46528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 13355 1727096153.46589: stdout chunk (state=3): >>>import 'ntpath' # <<< 13355 1727096153.46624: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bfac2c0> <<< 13355 1727096153.46639: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 13355 1727096153.46653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 13355 1727096153.46682: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13355 1727096153.46717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13355 1727096153.46802: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bfaea20> <<< 13355 1727096153.46878: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bfac3e0> <<< 13355 1727096153.46914: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bf6d2b0> 
<<< 13355 1727096153.46948: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 13355 1727096153.46975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bdad3d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bf4a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2beffd70> <<< 13355 1727096153.47094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 13355 1727096153.47099: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2a2bdad670> <<< 13355 1727096153.47360: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_1x27u_9d/ansible_stat_payload.zip' # zipimport: zlib available <<< 13355 1727096153.47490: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.47525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 13355 1727096153.47541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13355 1727096153.47580: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 13355 1727096153.47665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13355 1727096153.47705: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 
'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be03170> <<< 13355 1727096153.47708: stdout chunk (state=3): >>>import '_typing' # <<< 13355 1727096153.47900: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bde2060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bde11f0> <<< 13355 1727096153.48045: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 13355 1727096153.48066: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 13355 1727096153.49583: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.50683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be01040> <<< 13355 1727096153.50715: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 13355 1727096153.50745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 13355 1727096153.50770: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 13355 1727096153.50844: stdout chunk (state=3): >>># extension module '_json' loaded from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2be2aae0> <<< 13355 1727096153.50856: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be2a870> <<< 13355 1727096153.51025: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be2a180> <<< 13355 1727096153.51029: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 13355 1727096153.51032: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be2abd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be03e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2be2b860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2be2b9e0> <<< 13355 1727096153.51059: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13355 1727096153.51118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' 
<<< 13355 1727096153.51134: stdout chunk (state=3): >>>import '_locale' # <<< 13355 1727096153.51177: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be2bef0> <<< 13355 1727096153.51229: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 13355 1727096153.51237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 13355 1727096153.51280: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b70dc70> <<< 13355 1727096153.51293: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b70f890> <<< 13355 1727096153.51341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 13355 1727096153.51395: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b710290> <<< 13355 1727096153.51406: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 13355 1727096153.51456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b711430> <<< 13355 1727096153.51476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 13355 
1727096153.51508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 13355 1727096153.51540: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 13355 1727096153.51591: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b713f20> <<< 13355 1727096153.51630: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7185c0> <<< 13355 1727096153.51682: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7121e0> <<< 13355 1727096153.51685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13355 1727096153.51702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 13355 1727096153.51728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 13355 1727096153.51762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13355 1727096153.51799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 13355 1727096153.51817: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 13355 1727096153.51842: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b71bec0> import '_tokenize' # <<< 13355 1727096153.51911: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b71a990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b71a6f0> <<< 13355 1727096153.51946: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 13355 1727096153.51949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13355 1727096153.52022: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b71ac60> <<< 13355 1727096153.52051: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b712660> <<< 13355 1727096153.52082: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b763ec0> <<< 13355 1727096153.52121: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7641a0> <<< 13355 1727096153.52144: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 13355 1727096153.52189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 13355 1727096153.52194: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 13355 1727096153.52239: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b765c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b765a00> <<< 13355 1727096153.52256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 13355 1727096153.52390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 13355 1727096153.52451: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7681a0> <<< 13355 1727096153.52472: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b766300> <<< 13355 1727096153.52491: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 
13355 1727096153.52526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096153.52565: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 13355 1727096153.52579: stdout chunk (state=3): >>>import '_string' # <<< 13355 1727096153.52614: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b76b950> <<< 13355 1727096153.52747: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b768350> <<< 13355 1727096153.52816: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b76ca10> <<< 13355 1727096153.52853: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096153.52880: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b76cb60> <<< 13355 1727096153.52945: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b76cb30> <<< 13355 1727096153.52963: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7642c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 13355 1727096153.53098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 13355 1727096153.53109: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7f43b0> <<< 13355 1727096153.53298: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7f55e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b76eb40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b76fef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f2a2b76e7b0> # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096153.53316: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 13355 1727096153.53613: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 13355 1727096153.53681: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.53811: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.54376: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.54946: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 13355 1727096153.54978: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 13355 1727096153.55003: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096153.55070: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7fd7c0> <<< 13355 1727096153.55145: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 13355 1727096153.55173: stdout chunk 
(state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7fe570> <<< 13355 1727096153.55193: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7f5730> <<< 13355 1727096153.55234: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 13355 1727096153.55262: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.55293: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 13355 1727096153.55307: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.55450: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.55614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 13355 1727096153.55639: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7fe210> <<< 13355 1727096153.55657: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.56134: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.56605: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.56673: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.56751: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 13355 1727096153.56763: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.56798: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.56829: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 13355 1727096153.56847: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.56917: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.56997: 
stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 13355 1727096153.57156: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 13355 1727096153.57160: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 13355 1727096153.57641: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.57790: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13355 1727096153.57897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 13355 1727096153.57931: stdout chunk (state=3): >>>import '_ast' # <<< 13355 1727096153.58050: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7ff620> <<< 13355 1727096153.58078: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.58198: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.58307: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 13355 1727096153.58328: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 13355 1727096153.58348: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 13355 1727096153.58371: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 13355 1727096153.58405: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.58475: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.58726: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13355 1727096153.58760: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13355 1727096153.58872: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13355 1727096153.58948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096153.59082: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096153.59101: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 13355 1727096153.59111: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b60a090> <<< 13355 1727096153.59174: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b607d10> <<< 13355 1727096153.59219: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 13355 1727096153.59236: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 13355 1727096153.59265: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.59368: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.59473: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.59517: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.59588: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 13355 1727096153.59610: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 13355 1727096153.59650: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 13355 1727096153.59685: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13355 1727096153.59741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13355 1727096153.59814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 13355 1727096153.59859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 13355 1727096153.59937: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be867b0> <<< 13355 1727096153.59978: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be72480> <<< 13355 1727096153.60182: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7fc770> <<< 13355 1727096153.60449: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b6004d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 13355 1727096153.60457: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 13355 1727096153.60542: stdout chunk (state=3): >>># zipimport: zlib available <<< 13355 1727096153.60658: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 13355 1727096153.60944: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 13355 1727096153.61284: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 13355 1727096153.61295: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # 
cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse <<< 13355 1727096153.61331: stdout chunk (state=3): >>># destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 13355 1727096153.61344: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] 
removing ansible.module_utils.common.text <<< 13355 1727096153.61392: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec <<< 13355 1727096153.61401: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing 
swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 13355 1727096153.61980: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil <<< 13355 1727096153.61996: stdout chunk (state=3): >>># destroy distro # 
destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 13355 1727096153.62001: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 13355 1727096153.62025: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 13355 1727096153.62031: stdout chunk (state=3): >>># cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants <<< 13355 1727096153.62071: stdout chunk (state=3): >>># destroy re._casefix # destroy re._compiler <<< 13355 1727096153.62088: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping 
encodings.utf_8_sig # cleanup[3] wiping os <<< 13355 1727096153.62109: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 13355 1727096153.62239: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13355 1727096153.62341: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 13355 1727096153.62378: stdout chunk (state=3): >>># destroy _collections <<< 13355 1727096153.62406: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 13355 1727096153.62439: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 13355 1727096153.62673: stdout chunk (state=3): >>># destroy _typing <<< 13355 1727096153.62677: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # 
destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 13355 1727096153.62719: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator <<< 13355 1727096153.62750: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools # destroy _abc <<< 13355 1727096153.62791: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 13355 1727096153.63514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096153.63519: stdout chunk (state=3): >>><<< 13355 1727096153.63521: stderr chunk (state=3): >>><<< 13355 1727096153.63636: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c2184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c1e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c21aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c02d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c02dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c06bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c06bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c083b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c069070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0c23f0> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c082150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c0f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c0f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c066e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c110740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c111e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c112cc0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c1132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c112210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2c113d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c1134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2bed3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2befc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2befc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2befc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2befd070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2befda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2befc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bed1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2befee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2befdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2c0fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bf271a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bf4b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bfac2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bfaea20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bfac3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bf6d2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bdad3d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bf4a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2beffd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2a2bdad670> # zipimport: found 30 names in '/tmp/ansible_stat_payload_1x27u_9d/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2a2be03170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bde2060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2bde11f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be01040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2be2aae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be2a870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be2a180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be2abd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be03e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2be2b860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2be2b9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be2bef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b70dc70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b70f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b710290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b711430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b713f20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7185c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7121e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b71bec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b71a990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b71a6f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b71ac60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b712660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b763ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b765c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b765a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7681a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b766300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b76b950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b768350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b76ca10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b76cb60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b76cb30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7642c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7f43b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7f55e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b76eb40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b76fef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b76e7b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b7fd7c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7fe570> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7f5730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7fe210> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7ff620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2a2b60a090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b607d10> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be867b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2be72480> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b7fc770> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2a2b6004d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 13355 1727096153.64429: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096153.64432: _low_level_execute_command(): starting 13355 1727096153.64435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096153.1889527-13504-143674206333732/ > /dev/null 2>&1 && sleep 0' 13355 1727096153.65486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096153.65491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.65595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096153.65598: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.65685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.65942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096153.68674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096153.68699: stderr chunk (state=3): >>><<< 13355 1727096153.68702: stdout chunk (state=3): >>><<< 13355 1727096153.68718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096153.68724: handler run complete 13355 1727096153.68746: attempt loop complete, returning result 13355 1727096153.68750: _execute() done 13355 1727096153.68752: dumping result to json 13355 1727096153.68754: done dumping result, returning 13355 1727096153.68764: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [0afff68d-5257-c514-593f-0000000001cf] 13355 1727096153.68767: sending task result for task 0afff68d-5257-c514-593f-0000000001cf 13355 1727096153.68856: done sending task result for task 0afff68d-5257-c514-593f-0000000001cf 13355 1727096153.68859: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 13355 1727096153.68922: no more pending results, returning what we have 13355 1727096153.68925: results queue empty 13355 1727096153.68925: checking for any_errors_fatal 13355 1727096153.68933: done checking for any_errors_fatal 13355 1727096153.68933: checking for max_fail_percentage 13355 1727096153.68935: done checking for max_fail_percentage 13355 1727096153.68935: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.68936: done checking to see if all hosts have failed 13355 1727096153.68937: getting the remaining hosts for this loop 13355 1727096153.68938: done getting the remaining hosts for this loop 13355 1727096153.68941: getting the next task for host managed_node3 13355 1727096153.68947: done getting next task for host managed_node3 13355 1727096153.68949: ^ task is: TASK: Set flag to indicate system is ostree 13355 1727096153.68951: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096153.68954: getting variables 13355 1727096153.68955: in VariableManager get_vars() 13355 1727096153.68995: Calling all_inventory to load vars for managed_node3 13355 1727096153.68998: Calling groups_inventory to load vars for managed_node3 13355 1727096153.69001: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.69012: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.69014: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.69017: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.69179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.69298: done with get_vars() 13355 1727096153.69308: done getting variables 13355 1727096153.69381: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Monday 23 September 2024 08:55:53 -0400 (0:00:00.556) 0:00:02.954 ****** 13355 1727096153.69404: entering _queue_task() for managed_node3/set_fact 13355 1727096153.69406: Creating lock for set_fact 13355 1727096153.69638: worker is 1 (out of 1 available) 13355 
1727096153.69651: exiting _queue_task() for managed_node3/set_fact 13355 1727096153.69664: done queuing things up, now waiting for results queue to drain 13355 1727096153.69666: waiting for pending results... 13355 1727096153.69818: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 13355 1727096153.69888: in run() - task 0afff68d-5257-c514-593f-0000000001d0 13355 1727096153.69905: variable 'ansible_search_path' from source: unknown 13355 1727096153.69909: variable 'ansible_search_path' from source: unknown 13355 1727096153.69938: calling self._execute() 13355 1727096153.70010: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.70014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.70016: variable 'omit' from source: magic vars 13355 1727096153.70361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096153.70564: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096153.70604: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096153.70630: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096153.70658: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096153.70727: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096153.70745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096153.70765: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096153.70789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096153.70880: Evaluated conditional (not __network_is_ostree is defined): True 13355 1727096153.70883: variable 'omit' from source: magic vars 13355 1727096153.70913: variable 'omit' from source: magic vars 13355 1727096153.71003: variable '__ostree_booted_stat' from source: set_fact 13355 1727096153.71035: variable 'omit' from source: magic vars 13355 1727096153.71058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096153.71079: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096153.71093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096153.71109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096153.71121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096153.71140: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096153.71143: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.71146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.71218: Set connection var ansible_shell_executable to /bin/sh 13355 1727096153.71226: Set connection var ansible_shell_type to sh 13355 1727096153.71229: Set connection var ansible_pipelining to False 13355 1727096153.71232: Set connection var 
ansible_connection to ssh 13355 1727096153.71234: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096153.71237: Set connection var ansible_timeout to 10 13355 1727096153.71329: variable 'ansible_shell_executable' from source: unknown 13355 1727096153.71334: variable 'ansible_connection' from source: unknown 13355 1727096153.71336: variable 'ansible_module_compression' from source: unknown 13355 1727096153.71339: variable 'ansible_shell_type' from source: unknown 13355 1727096153.71341: variable 'ansible_shell_executable' from source: unknown 13355 1727096153.71343: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.71347: variable 'ansible_pipelining' from source: unknown 13355 1727096153.71350: variable 'ansible_timeout' from source: unknown 13355 1727096153.71354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.71357: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096153.71360: variable 'omit' from source: magic vars 13355 1727096153.71362: starting attempt loop 13355 1727096153.71364: running the handler 13355 1727096153.71368: handler run complete 13355 1727096153.71381: attempt loop complete, returning result 13355 1727096153.71383: _execute() done 13355 1727096153.71385: dumping result to json 13355 1727096153.71387: done dumping result, returning 13355 1727096153.71391: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [0afff68d-5257-c514-593f-0000000001d0] 13355 1727096153.71396: sending task result for task 0afff68d-5257-c514-593f-0000000001d0 13355 1727096153.71477: done sending task result for task 0afff68d-5257-c514-593f-0000000001d0 13355 
1727096153.71481: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 13355 1727096153.71531: no more pending results, returning what we have 13355 1727096153.71534: results queue empty 13355 1727096153.71535: checking for any_errors_fatal 13355 1727096153.71542: done checking for any_errors_fatal 13355 1727096153.71543: checking for max_fail_percentage 13355 1727096153.71545: done checking for max_fail_percentage 13355 1727096153.71545: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.71546: done checking to see if all hosts have failed 13355 1727096153.71547: getting the remaining hosts for this loop 13355 1727096153.71548: done getting the remaining hosts for this loop 13355 1727096153.71551: getting the next task for host managed_node3 13355 1727096153.71561: done getting next task for host managed_node3 13355 1727096153.71564: ^ task is: TASK: Fix CentOS6 Base repo 13355 1727096153.71566: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.71571: getting variables 13355 1727096153.71572: in VariableManager get_vars() 13355 1727096153.71601: Calling all_inventory to load vars for managed_node3 13355 1727096153.71603: Calling groups_inventory to load vars for managed_node3 13355 1727096153.71606: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.71615: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.71618: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.71627: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.71801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.71918: done with get_vars() 13355 1727096153.71925: done getting variables 13355 1727096153.72017: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Monday 23 September 2024 08:55:53 -0400 (0:00:00.026) 0:00:02.981 ****** 13355 1727096153.72039: entering _queue_task() for managed_node3/copy 13355 1727096153.72280: worker is 1 (out of 1 available) 13355 1727096153.72291: exiting _queue_task() for managed_node3/copy 13355 1727096153.72303: done queuing things up, now waiting for results queue to drain 13355 1727096153.72304: waiting for pending results... 
13355 1727096153.72456: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 13355 1727096153.72512: in run() - task 0afff68d-5257-c514-593f-0000000001d2 13355 1727096153.72525: variable 'ansible_search_path' from source: unknown 13355 1727096153.72529: variable 'ansible_search_path' from source: unknown 13355 1727096153.72648: calling self._execute() 13355 1727096153.72654: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.72657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.72660: variable 'omit' from source: magic vars 13355 1727096153.72915: variable 'ansible_distribution' from source: facts 13355 1727096153.72933: Evaluated conditional (ansible_distribution == 'CentOS'): True 13355 1727096153.73016: variable 'ansible_distribution_major_version' from source: facts 13355 1727096153.73021: Evaluated conditional (ansible_distribution_major_version == '6'): False 13355 1727096153.73023: when evaluation is False, skipping this task 13355 1727096153.73026: _execute() done 13355 1727096153.73030: dumping result to json 13355 1727096153.73034: done dumping result, returning 13355 1727096153.73042: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [0afff68d-5257-c514-593f-0000000001d2] 13355 1727096153.73047: sending task result for task 0afff68d-5257-c514-593f-0000000001d2 13355 1727096153.73137: done sending task result for task 0afff68d-5257-c514-593f-0000000001d2 13355 1727096153.73140: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13355 1727096153.73207: no more pending results, returning what we have 13355 1727096153.73211: results queue empty 13355 1727096153.73212: checking for any_errors_fatal 13355 1727096153.73216: done checking for any_errors_fatal 13355 1727096153.73217: checking for 
max_fail_percentage 13355 1727096153.73219: done checking for max_fail_percentage 13355 1727096153.73220: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.73220: done checking to see if all hosts have failed 13355 1727096153.73221: getting the remaining hosts for this loop 13355 1727096153.73222: done getting the remaining hosts for this loop 13355 1727096153.73226: getting the next task for host managed_node3 13355 1727096153.73233: done getting next task for host managed_node3 13355 1727096153.73235: ^ task is: TASK: Include the task 'enable_epel.yml' 13355 1727096153.73238: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.73241: getting variables 13355 1727096153.73242: in VariableManager get_vars() 13355 1727096153.73274: Calling all_inventory to load vars for managed_node3 13355 1727096153.73277: Calling groups_inventory to load vars for managed_node3 13355 1727096153.73280: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.73289: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.73292: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.73294: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.73423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.73562: done with get_vars() 13355 1727096153.73573: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Monday 23 September 2024 08:55:53 -0400 (0:00:00.015) 0:00:02.997 ****** 13355 1727096153.73637: entering _queue_task() for managed_node3/include_tasks 13355 1727096153.73864: worker is 1 (out of 1 available) 13355 1727096153.73882: exiting _queue_task() for managed_node3/include_tasks 13355 1727096153.73896: done queuing things up, now waiting for results queue to drain 13355 1727096153.73898: waiting for pending results... 
13355 1727096153.74046: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 13355 1727096153.74120: in run() - task 0afff68d-5257-c514-593f-0000000001d3 13355 1727096153.74135: variable 'ansible_search_path' from source: unknown 13355 1727096153.74138: variable 'ansible_search_path' from source: unknown 13355 1727096153.74166: calling self._execute() 13355 1727096153.74223: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.74228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.74245: variable 'omit' from source: magic vars 13355 1727096153.74532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096153.76081: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096153.76136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096153.76166: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096153.76197: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096153.76215: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096153.76278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096153.76299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096153.76322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096153.76347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096153.76361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096153.76451: variable '__network_is_ostree' from source: set_fact 13355 1727096153.76469: Evaluated conditional (not __network_is_ostree | d(false)): True 13355 1727096153.76475: _execute() done 13355 1727096153.76478: dumping result to json 13355 1727096153.76480: done dumping result, returning 13355 1727096153.76486: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [0afff68d-5257-c514-593f-0000000001d3] 13355 1727096153.76491: sending task result for task 0afff68d-5257-c514-593f-0000000001d3 13355 1727096153.76579: done sending task result for task 0afff68d-5257-c514-593f-0000000001d3 13355 1727096153.76582: WORKER PROCESS EXITING 13355 1727096153.76614: no more pending results, returning what we have 13355 1727096153.76619: in VariableManager get_vars() 13355 1727096153.76652: Calling all_inventory to load vars for managed_node3 13355 1727096153.76655: Calling groups_inventory to load vars for managed_node3 13355 1727096153.76658: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.76671: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.76673: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.76676: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.76843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 13355 1727096153.76958: done with get_vars() 13355 1727096153.76965: variable 'ansible_search_path' from source: unknown 13355 1727096153.76966: variable 'ansible_search_path' from source: unknown 13355 1727096153.76998: we have included files to process 13355 1727096153.76998: generating all_blocks data 13355 1727096153.77000: done generating all_blocks data 13355 1727096153.77005: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13355 1727096153.77006: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13355 1727096153.77007: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13355 1727096153.77484: done processing included file 13355 1727096153.77486: iterating over new_blocks loaded from include file 13355 1727096153.77487: in VariableManager get_vars() 13355 1727096153.77495: done with get_vars() 13355 1727096153.77496: filtering new block on tags 13355 1727096153.77511: done filtering new block on tags 13355 1727096153.77512: in VariableManager get_vars() 13355 1727096153.77519: done with get_vars() 13355 1727096153.77520: filtering new block on tags 13355 1727096153.77526: done filtering new block on tags 13355 1727096153.77527: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 13355 1727096153.77532: extending task lists for all hosts with included blocks 13355 1727096153.77595: done extending task lists 13355 1727096153.77596: done processing included files 13355 1727096153.77597: results queue empty 13355 1727096153.77597: checking for any_errors_fatal 13355 1727096153.77600: done checking for any_errors_fatal 13355 1727096153.77600: checking for max_fail_percentage 13355 1727096153.77601: done 
checking for max_fail_percentage 13355 1727096153.77602: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.77602: done checking to see if all hosts have failed 13355 1727096153.77602: getting the remaining hosts for this loop 13355 1727096153.77603: done getting the remaining hosts for this loop 13355 1727096153.77605: getting the next task for host managed_node3 13355 1727096153.77607: done getting next task for host managed_node3 13355 1727096153.77609: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 13355 1727096153.77610: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.77612: getting variables 13355 1727096153.77612: in VariableManager get_vars() 13355 1727096153.77618: Calling all_inventory to load vars for managed_node3 13355 1727096153.77620: Calling groups_inventory to load vars for managed_node3 13355 1727096153.77621: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.77625: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.77631: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.77633: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.77731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.77839: done with get_vars() 13355 1727096153.77846: done getting variables 13355 1727096153.77899: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 13355 1727096153.77989: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Monday 23 September 2024 08:55:53 -0400 (0:00:00.043) 0:00:03.040 ****** 13355 1727096153.78023: entering _queue_task() for managed_node3/command 13355 1727096153.78024: Creating lock for command 13355 1727096153.78269: worker is 1 (out of 1 available) 13355 1727096153.78282: exiting _queue_task() for managed_node3/command 13355 1727096153.78294: done queuing things up, now waiting for results queue to drain 13355 1727096153.78295: waiting for pending results... 
13355 1727096153.78451: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 13355 1727096153.78526: in run() - task 0afff68d-5257-c514-593f-0000000001ed 13355 1727096153.78544: variable 'ansible_search_path' from source: unknown 13355 1727096153.78547: variable 'ansible_search_path' from source: unknown 13355 1727096153.78578: calling self._execute() 13355 1727096153.78641: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.78645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.78656: variable 'omit' from source: magic vars 13355 1727096153.78928: variable 'ansible_distribution' from source: facts 13355 1727096153.78936: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13355 1727096153.79026: variable 'ansible_distribution_major_version' from source: facts 13355 1727096153.79030: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13355 1727096153.79035: when evaluation is False, skipping this task 13355 1727096153.79037: _execute() done 13355 1727096153.79040: dumping result to json 13355 1727096153.79044: done dumping result, returning 13355 1727096153.79054: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [0afff68d-5257-c514-593f-0000000001ed] 13355 1727096153.79057: sending task result for task 0afff68d-5257-c514-593f-0000000001ed 13355 1727096153.79150: done sending task result for task 0afff68d-5257-c514-593f-0000000001ed 13355 1727096153.79155: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13355 1727096153.79220: no more pending results, returning what we have 13355 1727096153.79224: results queue empty 13355 1727096153.79225: checking for any_errors_fatal 13355 1727096153.79226: done checking for any_errors_fatal 13355 1727096153.79227: checking for 
max_fail_percentage 13355 1727096153.79228: done checking for max_fail_percentage 13355 1727096153.79228: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.79229: done checking to see if all hosts have failed 13355 1727096153.79230: getting the remaining hosts for this loop 13355 1727096153.79231: done getting the remaining hosts for this loop 13355 1727096153.79234: getting the next task for host managed_node3 13355 1727096153.79240: done getting next task for host managed_node3 13355 1727096153.79243: ^ task is: TASK: Install yum-utils package 13355 1727096153.79246: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.79250: getting variables 13355 1727096153.79251: in VariableManager get_vars() 13355 1727096153.79285: Calling all_inventory to load vars for managed_node3 13355 1727096153.79287: Calling groups_inventory to load vars for managed_node3 13355 1727096153.79290: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.79300: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.79302: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.79304: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.79439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.79556: done with get_vars() 13355 1727096153.79564: done getting variables 13355 1727096153.79640: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Monday 23 September 2024 08:55:53 -0400 (0:00:00.016) 0:00:03.057 ****** 13355 1727096153.79663: entering _queue_task() for managed_node3/package 13355 1727096153.79664: Creating lock for package 13355 1727096153.79897: worker is 1 (out of 1 available) 13355 1727096153.79911: exiting _queue_task() for managed_node3/package 13355 1727096153.79923: done queuing things up, now waiting for results queue to drain 13355 1727096153.79925: waiting for pending results... 
13355 1727096153.80085: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 13355 1727096153.80162: in run() - task 0afff68d-5257-c514-593f-0000000001ee 13355 1727096153.80173: variable 'ansible_search_path' from source: unknown 13355 1727096153.80177: variable 'ansible_search_path' from source: unknown 13355 1727096153.80204: calling self._execute() 13355 1727096153.80270: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.80274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.80284: variable 'omit' from source: magic vars 13355 1727096153.80615: variable 'ansible_distribution' from source: facts 13355 1727096153.80626: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13355 1727096153.80718: variable 'ansible_distribution_major_version' from source: facts 13355 1727096153.80722: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13355 1727096153.80725: when evaluation is False, skipping this task 13355 1727096153.80728: _execute() done 13355 1727096153.80730: dumping result to json 13355 1727096153.80735: done dumping result, returning 13355 1727096153.80742: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [0afff68d-5257-c514-593f-0000000001ee] 13355 1727096153.80747: sending task result for task 0afff68d-5257-c514-593f-0000000001ee 13355 1727096153.80838: done sending task result for task 0afff68d-5257-c514-593f-0000000001ee 13355 1727096153.80841: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13355 1727096153.80887: no more pending results, returning what we have 13355 1727096153.80890: results queue empty 13355 1727096153.80891: checking for any_errors_fatal 13355 1727096153.80896: done checking for any_errors_fatal 13355 
1727096153.80897: checking for max_fail_percentage 13355 1727096153.80899: done checking for max_fail_percentage 13355 1727096153.80899: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.80900: done checking to see if all hosts have failed 13355 1727096153.80900: getting the remaining hosts for this loop 13355 1727096153.80901: done getting the remaining hosts for this loop 13355 1727096153.80905: getting the next task for host managed_node3 13355 1727096153.80912: done getting next task for host managed_node3 13355 1727096153.80914: ^ task is: TASK: Enable EPEL 7 13355 1727096153.80918: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.80921: getting variables 13355 1727096153.80922: in VariableManager get_vars() 13355 1727096153.80951: Calling all_inventory to load vars for managed_node3 13355 1727096153.80953: Calling groups_inventory to load vars for managed_node3 13355 1727096153.80956: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.80969: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.80972: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.80974: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.81140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.81258: done with get_vars() 13355 1727096153.81265: done getting variables 13355 1727096153.81310: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Monday 23 September 2024 08:55:53 -0400 (0:00:00.016) 0:00:03.073 ****** 13355 1727096153.81331: entering _queue_task() for managed_node3/command 13355 1727096153.81543: worker is 1 (out of 1 available) 13355 1727096153.81559: exiting _queue_task() for managed_node3/command 13355 1727096153.81573: done queuing things up, now waiting for results queue to drain 13355 1727096153.81574: waiting for pending results... 
13355 1727096153.81721: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 13355 1727096153.81793: in run() - task 0afff68d-5257-c514-593f-0000000001ef 13355 1727096153.81805: variable 'ansible_search_path' from source: unknown 13355 1727096153.81813: variable 'ansible_search_path' from source: unknown 13355 1727096153.81862: calling self._execute() 13355 1727096153.81926: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.81930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.81938: variable 'omit' from source: magic vars 13355 1727096153.82217: variable 'ansible_distribution' from source: facts 13355 1727096153.82227: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13355 1727096153.82317: variable 'ansible_distribution_major_version' from source: facts 13355 1727096153.82321: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13355 1727096153.82324: when evaluation is False, skipping this task 13355 1727096153.82327: _execute() done 13355 1727096153.82330: dumping result to json 13355 1727096153.82334: done dumping result, returning 13355 1727096153.82342: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [0afff68d-5257-c514-593f-0000000001ef] 13355 1727096153.82344: sending task result for task 0afff68d-5257-c514-593f-0000000001ef 13355 1727096153.82434: done sending task result for task 0afff68d-5257-c514-593f-0000000001ef 13355 1727096153.82436: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13355 1727096153.82502: no more pending results, returning what we have 13355 1727096153.82506: results queue empty 13355 1727096153.82506: checking for any_errors_fatal 13355 1727096153.82513: done checking for any_errors_fatal 13355 1727096153.82514: checking for 
max_fail_percentage 13355 1727096153.82515: done checking for max_fail_percentage 13355 1727096153.82516: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.82516: done checking to see if all hosts have failed 13355 1727096153.82517: getting the remaining hosts for this loop 13355 1727096153.82518: done getting the remaining hosts for this loop 13355 1727096153.82522: getting the next task for host managed_node3 13355 1727096153.82528: done getting next task for host managed_node3 13355 1727096153.82530: ^ task is: TASK: Enable EPEL 8 13355 1727096153.82533: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.82536: getting variables 13355 1727096153.82537: in VariableManager get_vars() 13355 1727096153.82562: Calling all_inventory to load vars for managed_node3 13355 1727096153.82564: Calling groups_inventory to load vars for managed_node3 13355 1727096153.82569: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.82579: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.82581: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.82583: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.82709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.82848: done with get_vars() 13355 1727096153.82858: done getting variables 13355 1727096153.82900: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Monday 23 September 2024 08:55:53 -0400 (0:00:00.015) 0:00:03.089 ****** 13355 1727096153.82922: entering _queue_task() for managed_node3/command 13355 1727096153.83137: worker is 1 (out of 1 available) 13355 1727096153.83149: exiting _queue_task() for managed_node3/command 13355 1727096153.83163: done queuing things up, now waiting for results queue to drain 13355 1727096153.83165: waiting for pending results... 
13355 1727096153.83315: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 13355 1727096153.83382: in run() - task 0afff68d-5257-c514-593f-0000000001f0 13355 1727096153.83426: variable 'ansible_search_path' from source: unknown 13355 1727096153.83430: variable 'ansible_search_path' from source: unknown 13355 1727096153.83453: calling self._execute() 13355 1727096153.83531: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.83543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.83673: variable 'omit' from source: magic vars 13355 1727096153.83954: variable 'ansible_distribution' from source: facts 13355 1727096153.83976: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13355 1727096153.84111: variable 'ansible_distribution_major_version' from source: facts 13355 1727096153.84124: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 13355 1727096153.84132: when evaluation is False, skipping this task 13355 1727096153.84138: _execute() done 13355 1727096153.84144: dumping result to json 13355 1727096153.84151: done dumping result, returning 13355 1727096153.84163: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [0afff68d-5257-c514-593f-0000000001f0] 13355 1727096153.84176: sending task result for task 0afff68d-5257-c514-593f-0000000001f0 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 13355 1727096153.84449: no more pending results, returning what we have 13355 1727096153.84455: results queue empty 13355 1727096153.84456: checking for any_errors_fatal 13355 1727096153.84466: done checking for any_errors_fatal 13355 1727096153.84469: checking for max_fail_percentage 13355 1727096153.84471: done checking for max_fail_percentage 13355 1727096153.84471: checking to see if all hosts have failed and 
the running result is not ok 13355 1727096153.84472: done checking to see if all hosts have failed 13355 1727096153.84473: getting the remaining hosts for this loop 13355 1727096153.84474: done getting the remaining hosts for this loop 13355 1727096153.84477: getting the next task for host managed_node3 13355 1727096153.84486: done getting next task for host managed_node3 13355 1727096153.84488: ^ task is: TASK: Enable EPEL 6 13355 1727096153.84491: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.84494: getting variables 13355 1727096153.84496: in VariableManager get_vars() 13355 1727096153.84522: Calling all_inventory to load vars for managed_node3 13355 1727096153.84524: Calling groups_inventory to load vars for managed_node3 13355 1727096153.84527: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.84535: done sending task result for task 0afff68d-5257-c514-593f-0000000001f0 13355 1727096153.84538: WORKER PROCESS EXITING 13355 1727096153.84614: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.84618: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.84621: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.84805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.85007: done with get_vars() 13355 1727096153.85018: done getting variables 13355 1727096153.85078: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Monday 23 September 2024 08:55:53 -0400 (0:00:00.021) 0:00:03.111 ****** 13355 1727096153.85110: entering _queue_task() for managed_node3/copy 13355 1727096153.85413: worker is 1 (out of 1 available) 13355 1727096153.85424: exiting _queue_task() for managed_node3/copy 13355 1727096153.85437: done queuing things up, now waiting for results queue to drain 13355 1727096153.85438: waiting for pending results... 
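The skip results in this stretch of the log come from `when:` conditionals in `enable_epel.yml`, evaluated against gathered facts: `ansible_distribution in ['RedHat', 'CentOS']` is True, but `ansible_distribution_major_version in ['7', '8']` (and, for the next task, `== '6'`) is False, so the task is skipped with a `false_condition` in the result. A minimal sketch of what such a fact-guarded task looks like — the module action is illustrative only, not the actual contents of `enable_epel.yml`:

```yaml
# Hypothetical sketch of a fact-guarded task; the command shown is
# illustrative, not the real body of enable_epel.yml. The conditionals
# mirror the ones evaluated in the log above.
- name: Enable EPEL 8
  command: dnf config-manager --set-enabled epel   # illustrative action
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

When any item in the `when:` list evaluates False, the task executor short-circuits before running the module, which is why the log shows "when evaluation is False, skipping this task" with no connection attempt.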
13355 1727096153.85693: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 13355 1727096153.85811: in run() - task 0afff68d-5257-c514-593f-0000000001f2 13355 1727096153.85830: variable 'ansible_search_path' from source: unknown 13355 1727096153.85838: variable 'ansible_search_path' from source: unknown 13355 1727096153.85882: calling self._execute() 13355 1727096153.85965: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.85980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.86073: variable 'omit' from source: magic vars 13355 1727096153.86365: variable 'ansible_distribution' from source: facts 13355 1727096153.86385: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 13355 1727096153.86497: variable 'ansible_distribution_major_version' from source: facts 13355 1727096153.86508: Evaluated conditional (ansible_distribution_major_version == '6'): False 13355 1727096153.86516: when evaluation is False, skipping this task 13355 1727096153.86522: _execute() done 13355 1727096153.86529: dumping result to json 13355 1727096153.86535: done dumping result, returning 13355 1727096153.86550: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [0afff68d-5257-c514-593f-0000000001f2] 13355 1727096153.86560: sending task result for task 0afff68d-5257-c514-593f-0000000001f2 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13355 1727096153.86719: no more pending results, returning what we have 13355 1727096153.86723: results queue empty 13355 1727096153.86724: checking for any_errors_fatal 13355 1727096153.86729: done checking for any_errors_fatal 13355 1727096153.86730: checking for max_fail_percentage 13355 1727096153.86732: done checking for max_fail_percentage 13355 1727096153.86732: checking to see if all hosts have failed and the running 
result is not ok 13355 1727096153.86733: done checking to see if all hosts have failed 13355 1727096153.86734: getting the remaining hosts for this loop 13355 1727096153.86735: done getting the remaining hosts for this loop 13355 1727096153.86739: getting the next task for host managed_node3 13355 1727096153.86751: done getting next task for host managed_node3 13355 1727096153.86754: ^ task is: TASK: Set network provider to 'nm' 13355 1727096153.86757: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096153.86762: getting variables 13355 1727096153.86764: in VariableManager get_vars() 13355 1727096153.86799: Calling all_inventory to load vars for managed_node3 13355 1727096153.86802: Calling groups_inventory to load vars for managed_node3 13355 1727096153.86805: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.86819: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.86821: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.86824: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.87327: done sending task result for task 0afff68d-5257-c514-593f-0000000001f2 13355 1727096153.87330: WORKER PROCESS EXITING 13355 1727096153.87354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.87532: done with get_vars() 13355 1727096153.87551: done getting variables 13355 1727096153.87630: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:13 Monday 23 September 2024 08:55:53 -0400 (0:00:00.025) 0:00:03.137 ****** 13355 1727096153.87662: entering _queue_task() for managed_node3/set_fact 13355 1727096153.87996: worker is 1 (out of 1 available) 13355 1727096153.88007: exiting _queue_task() for managed_node3/set_fact 13355 1727096153.88019: done queuing things up, now waiting for results queue to drain 13355 1727096153.88020: waiting for pending results... 13355 1727096153.88298: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 13355 1727096153.88379: in run() - task 0afff68d-5257-c514-593f-000000000007 13355 1727096153.88395: variable 'ansible_search_path' from source: unknown 13355 1727096153.88424: calling self._execute() 13355 1727096153.88500: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.88506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.88515: variable 'omit' from source: magic vars 13355 1727096153.88594: variable 'omit' from source: magic vars 13355 1727096153.88619: variable 'omit' from source: magic vars 13355 1727096153.88644: variable 'omit' from source: magic vars 13355 1727096153.88683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096153.88721: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096153.88737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096153.88751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096153.88765: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096153.88790: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096153.88795: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.88798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.88875: Set connection var ansible_shell_executable to /bin/sh 13355 1727096153.88879: Set connection var ansible_shell_type to sh 13355 1727096153.88885: Set connection var ansible_pipelining to False 13355 1727096153.88889: Set connection var ansible_connection to ssh 13355 1727096153.88894: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096153.88899: Set connection var ansible_timeout to 10 13355 1727096153.88920: variable 'ansible_shell_executable' from source: unknown 13355 1727096153.88923: variable 'ansible_connection' from source: unknown 13355 1727096153.88927: variable 'ansible_module_compression' from source: unknown 13355 1727096153.88929: variable 'ansible_shell_type' from source: unknown 13355 1727096153.88932: variable 'ansible_shell_executable' from source: unknown 13355 1727096153.88934: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.88936: variable 'ansible_pipelining' from source: unknown 13355 1727096153.88939: variable 'ansible_timeout' from source: unknown 13355 1727096153.88941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.89050: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096153.89061: variable 'omit' from source: magic vars 13355 1727096153.89066: starting 
attempt loop 13355 1727096153.89071: running the handler 13355 1727096153.89082: handler run complete 13355 1727096153.89090: attempt loop complete, returning result 13355 1727096153.89092: _execute() done 13355 1727096153.89094: dumping result to json 13355 1727096153.89098: done dumping result, returning 13355 1727096153.89105: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [0afff68d-5257-c514-593f-000000000007] 13355 1727096153.89110: sending task result for task 0afff68d-5257-c514-593f-000000000007 13355 1727096153.89192: done sending task result for task 0afff68d-5257-c514-593f-000000000007 13355 1727096153.89195: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 13355 1727096153.89250: no more pending results, returning what we have 13355 1727096153.89253: results queue empty 13355 1727096153.89253: checking for any_errors_fatal 13355 1727096153.89262: done checking for any_errors_fatal 13355 1727096153.89263: checking for max_fail_percentage 13355 1727096153.89264: done checking for max_fail_percentage 13355 1727096153.89265: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.89265: done checking to see if all hosts have failed 13355 1727096153.89266: getting the remaining hosts for this loop 13355 1727096153.89269: done getting the remaining hosts for this loop 13355 1727096153.89272: getting the next task for host managed_node3 13355 1727096153.89279: done getting next task for host managed_node3 13355 1727096153.89281: ^ task is: TASK: meta (flush_handlers) 13355 1727096153.89283: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.89286: getting variables 13355 1727096153.89287: in VariableManager get_vars() 13355 1727096153.89319: Calling all_inventory to load vars for managed_node3 13355 1727096153.89321: Calling groups_inventory to load vars for managed_node3 13355 1727096153.89324: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.89334: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.89336: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.89339: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.89474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.89594: done with get_vars() 13355 1727096153.89602: done getting variables 13355 1727096153.89647: in VariableManager get_vars() 13355 1727096153.89656: Calling all_inventory to load vars for managed_node3 13355 1727096153.89657: Calling groups_inventory to load vars for managed_node3 13355 1727096153.89659: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.89662: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.89663: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.89665: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.89748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.89880: done with get_vars() 13355 1727096153.89890: done queuing things up, now waiting for results queue to drain 13355 1727096153.89891: results queue empty 13355 1727096153.89891: checking for any_errors_fatal 13355 1727096153.89893: done checking for any_errors_fatal 13355 1727096153.89893: checking for max_fail_percentage 13355 1727096153.89894: done checking for max_fail_percentage 13355 1727096153.89894: checking to see if all hosts have failed and the running result is not 
ok 13355 1727096153.89895: done checking to see if all hosts have failed 13355 1727096153.89895: getting the remaining hosts for this loop 13355 1727096153.89896: done getting the remaining hosts for this loop 13355 1727096153.89897: getting the next task for host managed_node3 13355 1727096153.89900: done getting next task for host managed_node3 13355 1727096153.89901: ^ task is: TASK: meta (flush_handlers) 13355 1727096153.89902: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096153.89908: getting variables 13355 1727096153.89909: in VariableManager get_vars() 13355 1727096153.89915: Calling all_inventory to load vars for managed_node3 13355 1727096153.89917: Calling groups_inventory to load vars for managed_node3 13355 1727096153.89919: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.89923: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.89928: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.89931: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.90043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.90180: done with get_vars() 13355 1727096153.90188: done getting variables 13355 1727096153.90225: in VariableManager get_vars() 13355 1727096153.90232: Calling all_inventory to load vars for managed_node3 13355 1727096153.90233: Calling groups_inventory to load vars for managed_node3 13355 1727096153.90234: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.90238: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.90242: Calling groups_plugins_inventory to load vars for 
managed_node3 13355 1727096153.90245: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.90348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.90689: done with get_vars() 13355 1727096153.90701: done queuing things up, now waiting for results queue to drain 13355 1727096153.90702: results queue empty 13355 1727096153.90703: checking for any_errors_fatal 13355 1727096153.90705: done checking for any_errors_fatal 13355 1727096153.90705: checking for max_fail_percentage 13355 1727096153.90706: done checking for max_fail_percentage 13355 1727096153.90707: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.90708: done checking to see if all hosts have failed 13355 1727096153.90708: getting the remaining hosts for this loop 13355 1727096153.90709: done getting the remaining hosts for this loop 13355 1727096153.90711: getting the next task for host managed_node3 13355 1727096153.90714: done getting next task for host managed_node3 13355 1727096153.90715: ^ task is: None 13355 1727096153.90716: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.90717: done queuing things up, now waiting for results queue to drain 13355 1727096153.90718: results queue empty 13355 1727096153.90719: checking for any_errors_fatal 13355 1727096153.90719: done checking for any_errors_fatal 13355 1727096153.90720: checking for max_fail_percentage 13355 1727096153.90721: done checking for max_fail_percentage 13355 1727096153.90721: checking to see if all hosts have failed and the running result is not ok 13355 1727096153.90722: done checking to see if all hosts have failed 13355 1727096153.90723: getting the next task for host managed_node3 13355 1727096153.90725: done getting next task for host managed_node3 13355 1727096153.90726: ^ task is: None 13355 1727096153.90727: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.90775: in VariableManager get_vars() 13355 1727096153.90809: done with get_vars() 13355 1727096153.90816: in VariableManager get_vars() 13355 1727096153.90837: done with get_vars() 13355 1727096153.90842: variable 'omit' from source: magic vars 13355 1727096153.90879: in VariableManager get_vars() 13355 1727096153.90903: done with get_vars() 13355 1727096153.90926: variable 'omit' from source: magic vars PLAY [Play for testing bond removal] ******************************************* 13355 1727096153.91754: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 13355 1727096153.91779: getting the remaining hosts for this loop 13355 1727096153.91780: done getting the remaining hosts for this loop 13355 1727096153.91782: getting the next task for host managed_node3 13355 1727096153.91784: done getting next task for host managed_node3 13355 1727096153.91785: ^ task is: TASK: Gathering Facts 13355 1727096153.91786: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096153.91787: getting variables 13355 1727096153.91788: in VariableManager get_vars() 13355 1727096153.91801: Calling all_inventory to load vars for managed_node3 13355 1727096153.91803: Calling groups_inventory to load vars for managed_node3 13355 1727096153.91804: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096153.91808: Calling all_plugins_play to load vars for managed_node3 13355 1727096153.91817: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096153.91819: Calling groups_plugins_play to load vars for managed_node3 13355 1727096153.91908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096153.92019: done with get_vars() 13355 1727096153.92025: done getting variables 13355 1727096153.92056: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3 Monday 23 September 2024 08:55:53 -0400 (0:00:00.044) 0:00:03.181 ****** 13355 1727096153.92078: entering _queue_task() for managed_node3/gather_facts 13355 1727096153.92310: worker is 1 (out of 1 available) 13355 1727096153.92321: exiting _queue_task() for managed_node3/gather_facts 13355 1727096153.92333: done queuing things up, now waiting for results queue to drain 13355 1727096153.92335: waiting for pending results... 
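Before the queued `gather_facts` action can transfer its module, it bootstraps the remote host with two low-level shell commands, visible in the `_low_level_execute_command()` / SSH chunk lines that follow: one resolves the remote home directory, the other creates a mode-0700 temporary directory for the AnsiballZ payload. A rough local re-creation of that pattern — the directory suffix here is a placeholder, not the real `ansible-tmp-<timestamp>-<pid>-<random>` name from the log:

```shell
# Sketch of Ansible's two-step remote bootstrap as seen in this log.
# Step 1: resolve the remote user's home directory.
home=$(/bin/sh -c 'echo ~ && sleep 0')

# Step 2: create a private temp dir for the module payload. umask 77
# makes it mode 0700. The real run uses a unique name and plain mkdir
# (failing if it exists); -p here just keeps the sketch re-runnable.
tmpdir="$home/.ansible/tmp/ansible-tmp-example"
/bin/sh -c "( umask 77 && mkdir -p \"$home/.ansible/tmp\" && mkdir -p \"$tmpdir\" ) && sleep 0"
echo "$tmpdir"
```

The trailing `&& sleep 0` matches the log verbatim; it forces the shell to report the pipeline's exit status cleanly over the multiplexed SSH channel.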
13355 1727096153.92485: running TaskExecutor() for managed_node3/TASK: Gathering Facts 13355 1727096153.92540: in run() - task 0afff68d-5257-c514-593f-000000000218 13355 1727096153.92556: variable 'ansible_search_path' from source: unknown 13355 1727096153.92586: calling self._execute() 13355 1727096153.92648: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.92651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.92661: variable 'omit' from source: magic vars 13355 1727096153.92936: variable 'ansible_distribution_major_version' from source: facts 13355 1727096153.92946: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096153.92954: variable 'omit' from source: magic vars 13355 1727096153.92975: variable 'omit' from source: magic vars 13355 1727096153.93008: variable 'omit' from source: magic vars 13355 1727096153.93037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096153.93065: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096153.93082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096153.93095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096153.93117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096153.93132: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096153.93135: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.93138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.93206: Set connection var ansible_shell_executable to /bin/sh 13355 1727096153.93209: Set 
connection var ansible_shell_type to sh 13355 1727096153.93215: Set connection var ansible_pipelining to False 13355 1727096153.93226: Set connection var ansible_connection to ssh 13355 1727096153.93232: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096153.93238: Set connection var ansible_timeout to 10 13355 1727096153.93259: variable 'ansible_shell_executable' from source: unknown 13355 1727096153.93262: variable 'ansible_connection' from source: unknown 13355 1727096153.93264: variable 'ansible_module_compression' from source: unknown 13355 1727096153.93269: variable 'ansible_shell_type' from source: unknown 13355 1727096153.93272: variable 'ansible_shell_executable' from source: unknown 13355 1727096153.93274: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096153.93277: variable 'ansible_pipelining' from source: unknown 13355 1727096153.93280: variable 'ansible_timeout' from source: unknown 13355 1727096153.93284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096153.93457: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096153.93647: variable 'omit' from source: magic vars 13355 1727096153.93650: starting attempt loop 13355 1727096153.93655: running the handler 13355 1727096153.93657: variable 'ansible_facts' from source: unknown 13355 1727096153.93658: _low_level_execute_command(): starting 13355 1727096153.93660: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096153.94304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096153.94319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 
1727096153.94336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096153.94360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096153.94383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096153.94445: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096153.94489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096153.94502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096153.94522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.94599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096153.97007: stdout chunk (state=3): >>>/root <<< 13355 1727096153.97204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096153.97217: stdout chunk (state=3): >>><<< 13355 1727096153.97232: stderr chunk (state=3): >>><<< 13355 1727096153.97269: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096153.97291: _low_level_execute_command(): starting 13355 1727096153.97302: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568 `" && echo ansible-tmp-1727096153.9727752-13548-274436781541568="` echo /root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568 `" ) && sleep 0' 13355 1727096153.97943: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096153.97961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096153.97978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096153.97996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096153.98118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13355 1727096153.98123: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096153.98153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096153.98249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096154.01151: stdout chunk (state=3): >>>ansible-tmp-1727096153.9727752-13548-274436781541568=/root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568 <<< 13355 1727096154.01390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096154.01394: stdout chunk (state=3): >>><<< 13355 1727096154.01396: stderr chunk (state=3): >>><<< 13355 1727096154.01411: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096153.9727752-13548-274436781541568=/root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096154.01448: variable 'ansible_module_compression' from source: unknown 13355 1727096154.01573: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13355 1727096154.01595: variable 'ansible_facts' from source: unknown 13355 1727096154.01829: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/AnsiballZ_setup.py 13355 1727096154.02091: Sending initial data 13355 1727096154.02105: Sent initial data (154 bytes) 13355 1727096154.02707: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096154.02725: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096154.02798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096154.02839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096154.02865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096154.02884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096154.02940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096154.05326: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096154.05365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096154.05399: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpiskwrx0u /root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/AnsiballZ_setup.py <<< 13355 1727096154.05409: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/AnsiballZ_setup.py" <<< 13355 1727096154.05441: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpiskwrx0u" to remote "/root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/AnsiballZ_setup.py" <<< 13355 1727096154.05446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/AnsiballZ_setup.py" <<< 13355 1727096154.06481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096154.06521: stderr chunk (state=3): >>><<< 13355 1727096154.06525: stdout chunk (state=3): >>><<< 13355 1727096154.06544: done transferring module to remote 13355 1727096154.06556: _low_level_execute_command(): starting 13355 1727096154.06559: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/ /root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/AnsiballZ_setup.py && sleep 0' 13355 1727096154.07036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096154.07040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 
1727096154.07043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096154.07050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096154.07098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096154.07102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096154.07104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096154.07148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096154.09742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096154.09772: stderr chunk (state=3): >>><<< 13355 1727096154.09775: stdout chunk (state=3): >>><<< 13355 1727096154.09792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096154.09795: _low_level_execute_command(): starting 13355 1727096154.09799: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/AnsiballZ_setup.py && sleep 0' 13355 1727096154.10271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096154.10275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096154.10278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096154.10280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096154.10282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096154.10332: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096154.10335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096154.10337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096154.10394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096154.94422: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": "39", "day": "23", "hour": "08", "minute": "55", "second": "54", "epoch": "1727096154", "epoch_int": "1727096154", "date": "2024-09-23", "time": "08:55:54", "iso8601_micro": "2024-09-23T12:55:54.548855Z", "iso8601": "2024-09-23T12:55:54Z", "iso8601_basic": "20240923T085554548855", "iso8601_basic_short": "20240923T085554", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", 
"SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2972, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 559, "free": 2972}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": 
"Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 297, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805699072, "block_size": 4096, "block_total": 65519099, "block_available": 63917407, "block_used": 1601692, "inode_total": 131070960, "inode_available": 131029181, "inode_used": 41779, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", 
"netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off 
[fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": 
"off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.44189453125, "5m": 0.48193359375, "15m": 0.23828125}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13355 1727096154.98379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096154.98384: stderr chunk (state=3): >>><<< 13355 1727096154.98386: stdout chunk (state=3): >>><<< 13355 1727096154.98390: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7DqKfC0Ye8uNTSswfdzhsJfWNkZqr/RUstyC1z93+5saW22gxwNSvixV/q/ri8GxqpAsSE+oK2wsR0V0GdogZH271MQEZ63vu1adXuVvwMLN81oGLTzCz50BgmDXlvpEVTodRA7qlbviA7mmwUA/1WP1v1UqCiHmBjaiQT14v9PJhSSiWk1S/2gPlOmfl801arqI6YnvGYHiIop+XESUB4W8ewtJtTJhFqzucGnDCUYA1VAqcSIdjoQaZhZeDXc1gRTzx7QIe3EGJnCnomIPoXNvQk2UgnH08TJGZFoICgqdfZUi+g2uK+PweRJ1TUsmlf86iGOSFYgnCserDeAdpOIuLR5oBE8UNKjxPBW6hJ3It+vPT8mrYLOXrlbt7it9HhxfgYWcqc6ebJyBYNMo0bPdddEpIPiWDXZOoQmZGaGOpSa1K+8hxIvZ9t+jl8b8b8sODYeoVjVaeVXt0i5hoW0CoLWO77c+cGgwJ5+kzXD7eo/SVn+kYjDfKFBCtkJU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ80c6wNB873zUjbHI8VtU557DuKd4APS4WjMTqAOLKKumkxtoQY9gCkk5ZG2HLqdKHBgsFER7nThIQ+11R1mBw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINeHLdc2PHGxhVM3VxIFOOPlFPTEnJXEcPAkPavmyu6v", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-152", "ansible_nodename": "ip-10-31-14-152.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec24182c1ede649ec25b32fe78ed72bb", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Monday", "weekday_number": "1", "weeknumber": 
"39", "day": "23", "hour": "08", "minute": "55", "second": "54", "epoch": "1727096154", "epoch_int": "1727096154", "date": "2024-09-23", "time": "08:55:54", "iso8601_micro": "2024-09-23T12:55:54.548855Z", "iso8601": "2024-09-23T12:55:54Z", "iso8601_basic": "20240923T085554548855", "iso8601_basic_short": "20240923T085554", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.159 44844 10.31.14.152 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.159 44844 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 
3531, "ansible_memfree_mb": 2972, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 559, "free": 2972}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_uuid": "ec24182c-1ede-649e-c25b-32fe78ed72bb", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 297, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", 
"device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805699072, "block_size": 4096, "block_total": 65519099, "block_available": 63917407, "block_used": 1601692, "inode_total": 131070960, "inode_available": 131029181, "inode_used": 41779, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:df:7b:8e:75", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:dfff:fe7b:8e75", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": 
"on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.152", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:df:7b:8e:75", 
"mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.152"], "ansible_all_ipv6_addresses": ["fe80::8ff:dfff:fe7b:8e75"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.152", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:dfff:fe7b:8e75"]}, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.44189453125, "5m": 0.48193359375, "15m": 0.23828125}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096154.99091: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096154.99127: _low_level_execute_command(): starting 13355 1727096154.99172: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096153.9727752-13548-274436781541568/ > /dev/null 2>&1 && sleep 0' 13355 1727096155.00651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096155.00708: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096155.00720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096155.00817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096155.00835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096155.00861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096155.01100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096155.03926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096155.03940: stdout chunk (state=3): >>><<< 13355 1727096155.03990: stderr chunk (state=3): >>><<< 13355 1727096155.04005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096155.04280: handler run complete 13355 1727096155.04319: variable 'ansible_facts' from source: unknown 13355 1727096155.04536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096155.05193: variable 'ansible_facts' from source: unknown 13355 1727096155.05403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096155.05674: attempt loop complete, returning result 13355 1727096155.05678: _execute() done 13355 1727096155.05681: dumping result to json 13355 1727096155.05793: done dumping result, returning 13355 1727096155.05891: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [0afff68d-5257-c514-593f-000000000218] 13355 1727096155.05894: sending task result for task 0afff68d-5257-c514-593f-000000000218 13355 1727096155.06779: done sending task result for task 0afff68d-5257-c514-593f-000000000218 13355 1727096155.06784: WORKER PROCESS EXITING ok: [managed_node3] 13355 1727096155.07246: no more pending results, returning what we have 13355 1727096155.07249: results queue empty 13355 1727096155.07250: checking for any_errors_fatal 13355 1727096155.07252: done checking for any_errors_fatal 13355 1727096155.07252: checking for max_fail_percentage 13355 1727096155.07254: done checking for max_fail_percentage 13355 1727096155.07255: checking to see if all hosts have failed and the running result is not ok 
13355 1727096155.07255: done checking to see if all hosts have failed 13355 1727096155.07256: getting the remaining hosts for this loop 13355 1727096155.07257: done getting the remaining hosts for this loop 13355 1727096155.07261: getting the next task for host managed_node3 13355 1727096155.07266: done getting next task for host managed_node3 13355 1727096155.07347: ^ task is: TASK: meta (flush_handlers) 13355 1727096155.07351: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096155.07355: getting variables 13355 1727096155.07356: in VariableManager get_vars() 13355 1727096155.07401: Calling all_inventory to load vars for managed_node3 13355 1727096155.07404: Calling groups_inventory to load vars for managed_node3 13355 1727096155.07406: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096155.07416: Calling all_plugins_play to load vars for managed_node3 13355 1727096155.07419: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096155.07538: Calling groups_plugins_play to load vars for managed_node3 13355 1727096155.07840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096155.08369: done with get_vars() 13355 1727096155.08382: done getting variables 13355 1727096155.08456: in VariableManager get_vars() 13355 1727096155.08479: Calling all_inventory to load vars for managed_node3 13355 1727096155.08481: Calling groups_inventory to load vars for managed_node3 13355 1727096155.08484: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096155.08488: Calling all_plugins_play to load vars for managed_node3 13355 1727096155.08491: Calling groups_plugins_inventory to load vars for 
managed_node3 13355 1727096155.08493: Calling groups_plugins_play to load vars for managed_node3 13355 1727096155.08814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096155.09238: done with get_vars() 13355 1727096155.09254: done queuing things up, now waiting for results queue to drain 13355 1727096155.09256: results queue empty 13355 1727096155.09257: checking for any_errors_fatal 13355 1727096155.09261: done checking for any_errors_fatal 13355 1727096155.09262: checking for max_fail_percentage 13355 1727096155.09263: done checking for max_fail_percentage 13355 1727096155.09393: checking to see if all hosts have failed and the running result is not ok 13355 1727096155.09394: done checking to see if all hosts have failed 13355 1727096155.09395: getting the remaining hosts for this loop 13355 1727096155.09396: done getting the remaining hosts for this loop 13355 1727096155.09400: getting the next task for host managed_node3 13355 1727096155.09403: done getting next task for host managed_node3 13355 1727096155.09405: ^ task is: TASK: INIT Prepare setup 13355 1727096155.09407: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096155.09409: getting variables 13355 1727096155.09410: in VariableManager get_vars() 13355 1727096155.09432: Calling all_inventory to load vars for managed_node3 13355 1727096155.09435: Calling groups_inventory to load vars for managed_node3 13355 1727096155.09437: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096155.09442: Calling all_plugins_play to load vars for managed_node3 13355 1727096155.09444: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096155.09446: Calling groups_plugins_play to load vars for managed_node3 13355 1727096155.09695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096155.10220: done with get_vars() 13355 1727096155.10230: done getting variables 13355 1727096155.10433: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [INIT Prepare setup] ******************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:15
Monday 23 September 2024 08:55:55 -0400 (0:00:01.183) 0:00:04.365 ******

13355 1727096155.10462: entering _queue_task() for managed_node3/debug 13355 1727096155.10463: Creating lock for debug 13355 1727096155.11183: worker is 1 (out of 1 available) 13355 1727096155.11273: exiting _queue_task() for managed_node3/debug 13355 1727096155.11286: done queuing things up, now waiting for results queue to drain 13355 1727096155.11287: waiting for pending results... 
13355 1727096155.11787: running TaskExecutor() for managed_node3/TASK: INIT Prepare setup 13355 1727096155.11793: in run() - task 0afff68d-5257-c514-593f-00000000000b 13355 1727096155.11796: variable 'ansible_search_path' from source: unknown 13355 1727096155.12174: calling self._execute() 13355 1727096155.12179: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096155.12183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096155.12186: variable 'omit' from source: magic vars 13355 1727096155.13173: variable 'ansible_distribution_major_version' from source: facts 13355 1727096155.13177: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096155.13180: variable 'omit' from source: magic vars 13355 1727096155.13182: variable 'omit' from source: magic vars 13355 1727096155.13184: variable 'omit' from source: magic vars 13355 1727096155.13220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096155.13266: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096155.13293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096155.13572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096155.13576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096155.13579: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096155.13581: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096155.13583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096155.13660: Set connection var ansible_shell_executable to /bin/sh 13355 1727096155.13973: 
Set connection var ansible_shell_type to sh 13355 1727096155.13977: Set connection var ansible_pipelining to False 13355 1727096155.13979: Set connection var ansible_connection to ssh 13355 1727096155.13981: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096155.13983: Set connection var ansible_timeout to 10 13355 1727096155.13986: variable 'ansible_shell_executable' from source: unknown 13355 1727096155.13988: variable 'ansible_connection' from source: unknown 13355 1727096155.13989: variable 'ansible_module_compression' from source: unknown 13355 1727096155.13992: variable 'ansible_shell_type' from source: unknown 13355 1727096155.13994: variable 'ansible_shell_executable' from source: unknown 13355 1727096155.13996: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096155.13997: variable 'ansible_pipelining' from source: unknown 13355 1727096155.13999: variable 'ansible_timeout' from source: unknown 13355 1727096155.14001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096155.14122: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096155.14191: variable 'omit' from source: magic vars 13355 1727096155.14251: starting attempt loop 13355 1727096155.14258: running the handler 13355 1727096155.14316: handler run complete 13355 1727096155.14406: attempt loop complete, returning result 13355 1727096155.14477: _execute() done 13355 1727096155.14486: dumping result to json 13355 1727096155.14500: done dumping result, returning 13355 1727096155.14514: done running TaskExecutor() for managed_node3/TASK: INIT Prepare setup [0afff68d-5257-c514-593f-00000000000b] 13355 1727096155.14525: sending task result for task 
0afff68d-5257-c514-593f-00000000000b

ok: [managed_node3] => {}

MSG:

##################################################

13355 1727096155.14679: no more pending results, returning what we have 13355 1727096155.14682: results queue empty 13355 1727096155.14683: checking for any_errors_fatal 13355 1727096155.14686: done checking for any_errors_fatal 13355 1727096155.14687: checking for max_fail_percentage 13355 1727096155.14688: done checking for max_fail_percentage 13355 1727096155.14689: checking to see if all hosts have failed and the running result is not ok 13355 1727096155.14690: done checking to see if all hosts have failed 13355 1727096155.14690: getting the remaining hosts for this loop 13355 1727096155.14692: done getting the remaining hosts for this loop 13355 1727096155.14696: getting the next task for host managed_node3 13355 1727096155.14702: done getting next task for host managed_node3 13355 1727096155.14706: ^ task is: TASK: Install dnsmasq 13355 1727096155.14709: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096155.14714: getting variables 13355 1727096155.14715: in VariableManager get_vars() 13355 1727096155.14776: Calling all_inventory to load vars for managed_node3 13355 1727096155.14778: Calling groups_inventory to load vars for managed_node3 13355 1727096155.14781: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096155.14793: Calling all_plugins_play to load vars for managed_node3 13355 1727096155.14796: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096155.14799: Calling groups_plugins_play to load vars for managed_node3 13355 1727096155.15521: done sending task result for task 0afff68d-5257-c514-593f-00000000000b 13355 1727096155.15525: WORKER PROCESS EXITING 13355 1727096155.15556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096155.16008: done with get_vars() 13355 1727096155.16021: done getting variables 13355 1727096155.16187: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Install dnsmasq] *********************************************************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Monday 23 September 2024 08:55:55 -0400 (0:00:00.057) 0:00:04.422 ******

13355 1727096155.16223: entering _queue_task() for managed_node3/package 13355 1727096155.16917: worker is 1 (out of 1 available) 13355 1727096155.16929: exiting _queue_task() for managed_node3/package 13355 1727096155.16941: done queuing things up, now waiting for results queue to drain 13355 1727096155.16943: waiting for pending results... 
13355 1727096155.17528: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 13355 1727096155.17634: in run() - task 0afff68d-5257-c514-593f-00000000000f 13355 1727096155.17647: variable 'ansible_search_path' from source: unknown 13355 1727096155.17651: variable 'ansible_search_path' from source: unknown 13355 1727096155.17795: calling self._execute() 13355 1727096155.17880: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096155.17885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096155.17894: variable 'omit' from source: magic vars 13355 1727096155.18760: variable 'ansible_distribution_major_version' from source: facts 13355 1727096155.18773: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096155.18780: variable 'omit' from source: magic vars 13355 1727096155.18830: variable 'omit' from source: magic vars 13355 1727096155.19676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096155.23674: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096155.23679: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096155.24075: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096155.24080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096155.24082: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096155.24475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096155.24483: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096155.24487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096155.24489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096155.24492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096155.24705: variable '__network_is_ostree' from source: set_fact 13355 1727096155.24710: variable 'omit' from source: magic vars 13355 1727096155.24741: variable 'omit' from source: magic vars 13355 1727096155.24775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096155.24804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096155.24821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096155.24837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096155.24847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096155.25084: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096155.25088: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096155.25090: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13355 1727096155.25189: Set connection var ansible_shell_executable to /bin/sh 13355 1727096155.25195: Set connection var ansible_shell_type to sh 13355 1727096155.25201: Set connection var ansible_pipelining to False 13355 1727096155.25206: Set connection var ansible_connection to ssh 13355 1727096155.25211: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096155.25217: Set connection var ansible_timeout to 10 13355 1727096155.25244: variable 'ansible_shell_executable' from source: unknown 13355 1727096155.25247: variable 'ansible_connection' from source: unknown 13355 1727096155.25250: variable 'ansible_module_compression' from source: unknown 13355 1727096155.25252: variable 'ansible_shell_type' from source: unknown 13355 1727096155.25254: variable 'ansible_shell_executable' from source: unknown 13355 1727096155.25261: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096155.25263: variable 'ansible_pipelining' from source: unknown 13355 1727096155.25265: variable 'ansible_timeout' from source: unknown 13355 1727096155.25269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096155.25573: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096155.25581: variable 'omit' from source: magic vars 13355 1727096155.25584: starting attempt loop 13355 1727096155.25587: running the handler 13355 1727096155.25594: variable 'ansible_facts' from source: unknown 13355 1727096155.25597: variable 'ansible_facts' from source: unknown 13355 1727096155.25632: _low_level_execute_command(): starting 13355 1727096155.25639: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 
1727096155.27081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096155.27104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096155.27222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096155.27274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096155.27279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096155.27329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096155.27451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096155.29810: stdout chunk (state=3): >>>/root <<< 13355 1727096155.30104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096155.30155: stderr chunk (state=3): >>><<< 13355 1727096155.30159: stdout chunk (state=3): >>><<< 13355 1727096155.30479: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096155.30490: _low_level_execute_command(): starting 13355 1727096155.30493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398 `" && echo ansible-tmp-1727096155.303849-13593-121836144277398="` echo /root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398 `" ) && sleep 0' 13355 1727096155.31536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096155.31541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
13355 1727096155.31659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096155.31673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096155.31807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096155.31873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096155.34639: stdout chunk (state=3): >>>ansible-tmp-1727096155.303849-13593-121836144277398=/root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398 <<< 13355 1727096155.34817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096155.34862: stderr chunk (state=3): >>><<< 13355 1727096155.34874: stdout chunk (state=3): >>><<< 13355 1727096155.35171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096155.303849-13593-121836144277398=/root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096155.35174: variable 'ansible_module_compression' from source: unknown 13355 1727096155.35574: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 13355 1727096155.35578: ANSIBALLZ: Acquiring lock 13355 1727096155.35580: ANSIBALLZ: Lock acquired: 140397099650992 13355 1727096155.35581: ANSIBALLZ: Creating module 13355 1727096155.73104: ANSIBALLZ: Writing module into payload 13355 1727096155.73515: ANSIBALLZ: Writing module 13355 1727096155.73592: ANSIBALLZ: Renaming module 13355 1727096155.73643: ANSIBALLZ: Done creating module 13355 1727096155.73782: variable 'ansible_facts' from source: unknown 13355 1727096155.73998: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/AnsiballZ_dnf.py 13355 1727096155.74281: Sending initial data 13355 1727096155.74293: Sent initial data (151 bytes) 13355 1727096155.75644: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096155.75749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096155.75943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096155.76099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13355 1727096155.78465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096155.78574: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096155.78602: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpdkzra6gq /root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/AnsiballZ_dnf.py <<< 13355 1727096155.78607: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpdkzra6gq" to remote "/root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/AnsiballZ_dnf.py" <<< 13355 1727096155.80680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096155.80685: stdout chunk (state=3): >>><<< 13355 1727096155.80687: stderr chunk (state=3): >>><<< 13355 1727096155.80689: done transferring module to remote 13355 1727096155.80691: _low_level_execute_command(): starting 13355 1727096155.80694: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/ /root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/AnsiballZ_dnf.py && sleep 0' 13355 1727096155.81723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096155.81812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096155.81909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096155.81927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096155.82100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13355 1727096155.84995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096155.85000: stdout chunk (state=3): >>><<< 13355 1727096155.85002: stderr chunk (state=3): >>><<< 13355 1727096155.85111: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 13355 1727096155.85115: _low_level_execute_command(): starting 13355 1727096155.85118: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/AnsiballZ_dnf.py && sleep 0' 13355 1727096155.86283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096155.86299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096155.86489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096155.86492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096155.86494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096155.86561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 13355 1727096156.31866: stdout chunk (state=3): >>> {"msg": 
"Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13355 1727096156.36630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096156.36655: stdout chunk (state=3): >>><<< 13355 1727096156.36658: stderr chunk (state=3): >>><<< 13355 1727096156.36805: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096156.36809: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096156.36819: _low_level_execute_command(): starting 13355 1727096156.36830: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096155.303849-13593-121836144277398/ > /dev/null 2>&1 && sleep 0' 13355 1727096156.37559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096156.37607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096156.37623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096156.37718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096156.37732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.37813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096156.37846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096156.37980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096156.38027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096156.39960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096156.39964: stdout chunk (state=3): >>><<< 13355 1727096156.39976: stderr chunk (state=3): >>><<< 13355 1727096156.39993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096156.40001: handler run complete 13355 1727096156.40196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096156.40393: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096156.40430: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096156.40459: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096156.40674: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096156.40677: variable '__install_status' from source: unknown 13355 1727096156.40679: Evaluated conditional (__install_status is success): True 13355 1727096156.40681: attempt loop complete, returning result 13355 1727096156.40683: _execute() done 13355 1727096156.40685: dumping result to json 13355 1727096156.40687: done dumping result, returning 13355 1727096156.40689: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [0afff68d-5257-c514-593f-00000000000f] 13355 1727096156.40690: sending task result for task 0afff68d-5257-c514-593f-00000000000f 13355 1727096156.40778: done sending task result for task 0afff68d-5257-c514-593f-00000000000f 13355 1727096156.40780: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13355 1727096156.40859: no more pending results, returning what we have 13355 1727096156.40862: results queue empty 13355 1727096156.40863: checking for any_errors_fatal 13355 1727096156.41072: done checking for any_errors_fatal 13355 1727096156.41074: checking for max_fail_percentage 13355 1727096156.41075: done checking for max_fail_percentage 13355 
1727096156.41076: checking to see if all hosts have failed and the running result is not ok 13355 1727096156.41076: done checking to see if all hosts have failed 13355 1727096156.41077: getting the remaining hosts for this loop 13355 1727096156.41078: done getting the remaining hosts for this loop 13355 1727096156.41082: getting the next task for host managed_node3 13355 1727096156.41087: done getting next task for host managed_node3 13355 1727096156.41090: ^ task is: TASK: Install pgrep, sysctl 13355 1727096156.41093: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096156.41096: getting variables 13355 1727096156.41097: in VariableManager get_vars() 13355 1727096156.41161: Calling all_inventory to load vars for managed_node3 13355 1727096156.41164: Calling groups_inventory to load vars for managed_node3 13355 1727096156.41166: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096156.41180: Calling all_plugins_play to load vars for managed_node3 13355 1727096156.41187: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096156.41193: Calling groups_plugins_play to load vars for managed_node3 13355 1727096156.41509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096156.41787: done with get_vars() 13355 1727096156.41798: done getting variables 13355 1727096156.41858: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Monday 23 September 2024 08:55:56 -0400 (0:00:01.256) 0:00:05.679 ****** 13355 1727096156.41887: entering _queue_task() for managed_node3/package 13355 1727096156.42208: worker is 1 (out of 1 available) 13355 1727096156.42220: exiting _queue_task() for managed_node3/package 13355 1727096156.42231: done queuing things up, now waiting for results queue to drain 13355 1727096156.42233: waiting for pending results... 
13355 1727096156.42512: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 13355 1727096156.42764: in run() - task 0afff68d-5257-c514-593f-000000000010 13355 1727096156.42805: variable 'ansible_search_path' from source: unknown 13355 1727096156.42814: variable 'ansible_search_path' from source: unknown 13355 1727096156.42985: calling self._execute() 13355 1727096156.43126: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096156.43138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096156.43152: variable 'omit' from source: magic vars 13355 1727096156.44114: variable 'ansible_distribution_major_version' from source: facts 13355 1727096156.44183: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096156.44685: variable 'ansible_os_family' from source: facts 13355 1727096156.44737: Evaluated conditional (ansible_os_family == 'RedHat'): True 13355 1727096156.45229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096156.45599: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096156.45693: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096156.45729: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096156.45815: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096156.46073: variable 'ansible_distribution_major_version' from source: facts 13355 1727096156.46210: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 13355 1727096156.46216: when evaluation is False, skipping this task 13355 1727096156.46220: _execute() done 13355 1727096156.46223: dumping result to json 13355 1727096156.46225: done dumping result, 
returning 13355 1727096156.46227: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0afff68d-5257-c514-593f-000000000010] 13355 1727096156.46230: sending task result for task 0afff68d-5257-c514-593f-000000000010 13355 1727096156.46624: done sending task result for task 0afff68d-5257-c514-593f-000000000010 13355 1727096156.46627: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 13355 1727096156.46711: no more pending results, returning what we have 13355 1727096156.46715: results queue empty 13355 1727096156.46716: checking for any_errors_fatal 13355 1727096156.46730: done checking for any_errors_fatal 13355 1727096156.46731: checking for max_fail_percentage 13355 1727096156.46733: done checking for max_fail_percentage 13355 1727096156.46733: checking to see if all hosts have failed and the running result is not ok 13355 1727096156.46734: done checking to see if all hosts have failed 13355 1727096156.46735: getting the remaining hosts for this loop 13355 1727096156.46736: done getting the remaining hosts for this loop 13355 1727096156.46740: getting the next task for host managed_node3 13355 1727096156.46747: done getting next task for host managed_node3 13355 1727096156.46750: ^ task is: TASK: Install pgrep, sysctl 13355 1727096156.46753: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13355 1727096156.46759: getting variables 13355 1727096156.46760: in VariableManager get_vars() 13355 1727096156.46829: Calling all_inventory to load vars for managed_node3 13355 1727096156.46832: Calling groups_inventory to load vars for managed_node3 13355 1727096156.46836: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096156.46852: Calling all_plugins_play to load vars for managed_node3 13355 1727096156.46857: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096156.46861: Calling groups_plugins_play to load vars for managed_node3 13355 1727096156.47725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096156.47963: done with get_vars() 13355 1727096156.47977: done getting variables 13355 1727096156.48057: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Monday 23 September 2024 08:55:56 -0400 (0:00:00.062) 0:00:05.741 ****** 13355 1727096156.48117: entering _queue_task() for managed_node3/package 13355 1727096156.48697: worker is 1 (out of 1 available) 13355 1727096156.48705: exiting _queue_task() for managed_node3/package 13355 1727096156.48715: done queuing things up, now waiting for results queue to drain 13355 1727096156.48716: waiting for pending results... 
13355 1727096156.49147: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 13355 1727096156.49151: in run() - task 0afff68d-5257-c514-593f-000000000011 13355 1727096156.49169: variable 'ansible_search_path' from source: unknown 13355 1727096156.49213: variable 'ansible_search_path' from source: unknown 13355 1727096156.49395: calling self._execute() 13355 1727096156.49545: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096156.49577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096156.49592: variable 'omit' from source: magic vars 13355 1727096156.50460: variable 'ansible_distribution_major_version' from source: facts 13355 1727096156.50517: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096156.50775: variable 'ansible_os_family' from source: facts 13355 1727096156.50941: Evaluated conditional (ansible_os_family == 'RedHat'): True 13355 1727096156.51334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096156.52063: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096156.52144: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096156.52194: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096156.52242: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096156.52386: variable 'ansible_distribution_major_version' from source: facts 13355 1727096156.52459: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 13355 1727096156.52462: variable 'omit' from source: magic vars 13355 1727096156.52507: variable 'omit' from source: magic vars 13355 1727096156.52793: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096156.54805: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096156.54852: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096156.54885: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096156.54917: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096156.54941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096156.55017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096156.55038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096156.55058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096156.55087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096156.55100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096156.55239: variable '__network_is_ostree' from source: set_fact 13355 1727096156.55242: 
variable 'omit' from source: magic vars 13355 1727096156.55278: variable 'omit' from source: magic vars 13355 1727096156.55299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096156.55320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096156.55348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096156.55363: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096156.55371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096156.55393: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096156.55407: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096156.55410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096156.55517: Set connection var ansible_shell_executable to /bin/sh 13355 1727096156.55527: Set connection var ansible_shell_type to sh 13355 1727096156.55581: Set connection var ansible_pipelining to False 13355 1727096156.55587: Set connection var ansible_connection to ssh 13355 1727096156.55590: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096156.55592: Set connection var ansible_timeout to 10 13355 1727096156.55600: variable 'ansible_shell_executable' from source: unknown 13355 1727096156.55602: variable 'ansible_connection' from source: unknown 13355 1727096156.55608: variable 'ansible_module_compression' from source: unknown 13355 1727096156.55610: variable 'ansible_shell_type' from source: unknown 13355 1727096156.55612: variable 'ansible_shell_executable' from source: unknown 13355 1727096156.55614: variable 'ansible_host' from source: host vars for 'managed_node3' 
13355 1727096156.55616: variable 'ansible_pipelining' from source: unknown 13355 1727096156.55618: variable 'ansible_timeout' from source: unknown 13355 1727096156.55620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096156.55953: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096156.55957: variable 'omit' from source: magic vars 13355 1727096156.55959: starting attempt loop 13355 1727096156.55962: running the handler 13355 1727096156.55997: variable 'ansible_facts' from source: unknown 13355 1727096156.56000: variable 'ansible_facts' from source: unknown 13355 1727096156.56002: _low_level_execute_command(): starting 13355 1727096156.56004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096156.56630: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096156.56635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096156.56637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096156.56640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096156.56642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096156.56644: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096156.56646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.56649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096156.56679: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096156.56682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096156.56684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096156.56686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096156.56688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096156.56690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096156.56856: stderr chunk (state=3): >>>debug2: match found <<< 13355 1727096156.56863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096156.57030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096156.57065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096156.58772: stdout chunk (state=3): >>>/root <<< 13355 1727096156.58865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096156.58909: stderr chunk (state=3): >>><<< 13355 1727096156.58911: stdout chunk (state=3): >>><<< 13355 1727096156.58927: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096156.58941: _low_level_execute_command(): starting 13355 1727096156.58986: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590 `" && echo ansible-tmp-1727096156.5893264-13642-141878131966590="` echo /root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590 `" ) && sleep 0' 13355 1727096156.59470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096156.59476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096156.59480: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.59565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096156.59599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096156.59632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096156.61607: stdout chunk (state=3): >>>ansible-tmp-1727096156.5893264-13642-141878131966590=/root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590 <<< 13355 1727096156.61711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096156.61747: stderr chunk (state=3): >>><<< 13355 1727096156.61749: stdout chunk (state=3): >>><<< 13355 1727096156.61762: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096156.5893264-13642-141878131966590=/root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096156.61876: variable 'ansible_module_compression' from source: unknown 13355 1727096156.61879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 13355 1727096156.61881: variable 'ansible_facts' from source: unknown 13355 1727096156.61960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/AnsiballZ_dnf.py 13355 1727096156.62064: Sending initial data 13355 1727096156.62070: Sent initial data (152 bytes) 13355 1727096156.62709: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.62777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096156.62793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.62841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096156.62866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096156.62899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096156.62961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096156.64570: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096156.64598: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096156.64631: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp_ja2q0ec /root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/AnsiballZ_dnf.py <<< 13355 1727096156.64634: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/AnsiballZ_dnf.py" <<< 13355 1727096156.64665: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp_ja2q0ec" to remote "/root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/AnsiballZ_dnf.py" <<< 13355 1727096156.65279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096156.65324: stderr chunk (state=3): >>><<< 13355 1727096156.65328: stdout chunk (state=3): >>><<< 13355 1727096156.65349: done transferring module to remote 13355 1727096156.65360: _low_level_execute_command(): starting 13355 1727096156.65365: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/ /root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/AnsiballZ_dnf.py && sleep 0' 13355 1727096156.65806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096156.65810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096156.65812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.65814: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096156.65816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.65858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096156.65865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096156.65889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096156.65922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096156.67744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096156.67772: stderr chunk (state=3): >>><<< 13355 1727096156.67775: stdout chunk (state=3): >>><<< 13355 1727096156.67789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096156.67791: _low_level_execute_command(): starting 13355 1727096156.67797: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/AnsiballZ_dnf.py && sleep 0' 13355 1727096156.68238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096156.68279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096156.68282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.68284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096156.68286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096156.68288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096156.68290: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096156.68343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096156.68348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096156.68351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096156.68400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096157.10788: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13355 1727096157.15076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096157.15080: stdout chunk (state=3): >>><<< 13355 1727096157.15082: stderr chunk (state=3): >>><<< 13355 1727096157.15085: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096157.15088: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096157.15091: _low_level_execute_command(): starting 13355 1727096157.15093: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096156.5893264-13642-141878131966590/ > /dev/null 2>&1 && sleep 0' 13355 1727096157.16284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096157.16288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096157.16291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
<<< 13355 1727096157.16293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096157.16295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096157.16358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096157.16366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096157.16424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096157.18574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096157.18578: stdout chunk (state=3): >>><<< 13355 1727096157.18580: stderr chunk (state=3): >>><<< 13355 1727096157.18583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096157.18585: handler run complete 13355 1727096157.18587: attempt loop complete, returning result 13355 1727096157.18589: _execute() done 13355 1727096157.18591: dumping result to json 13355 1727096157.18593: done dumping result, returning 13355 1727096157.18595: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [0afff68d-5257-c514-593f-000000000011] 13355 1727096157.18597: sending task result for task 0afff68d-5257-c514-593f-000000000011 13355 1727096157.18676: done sending task result for task 0afff68d-5257-c514-593f-000000000011 13355 1727096157.18680: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 13355 1727096157.18755: no more pending results, returning what we have 13355 1727096157.18759: results queue empty 13355 1727096157.18760: checking for any_errors_fatal 13355 1727096157.18771: done checking for any_errors_fatal 13355 1727096157.18772: checking for max_fail_percentage 13355 1727096157.18774: done checking for max_fail_percentage 13355 1727096157.18774: checking to see if all hosts have failed and the running result is not ok 13355 1727096157.18775: done checking to see if all hosts have failed 13355 1727096157.18776: getting the remaining hosts for this loop 13355 1727096157.18777: done getting the remaining hosts for this loop 13355 1727096157.18781: getting the next task for host managed_node3 13355 1727096157.18788: done getting next task for host managed_node3 13355 1727096157.18790: ^ task is: TASK: Create test interfaces 13355 1727096157.18793: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096157.18801: getting variables 13355 1727096157.18803: in VariableManager get_vars() 13355 1727096157.18857: Calling all_inventory to load vars for managed_node3 13355 1727096157.19130: Calling groups_inventory to load vars for managed_node3 13355 1727096157.19134: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096157.19144: Calling all_plugins_play to load vars for managed_node3 13355 1727096157.19148: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096157.19151: Calling groups_plugins_play to load vars for managed_node3 13355 1727096157.19335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096157.19551: done with get_vars() 13355 1727096157.19563: done getting variables 13355 1727096157.19658: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Monday 23 September 2024 08:55:57 -0400 (0:00:00.715) 0:00:06.457 ****** 13355 1727096157.19693: entering _queue_task() for managed_node3/shell 13355 
1727096157.19695: Creating lock for shell 13355 1727096157.19963: worker is 1 (out of 1 available) 13355 1727096157.19977: exiting _queue_task() for managed_node3/shell 13355 1727096157.19989: done queuing things up, now waiting for results queue to drain 13355 1727096157.19990: waiting for pending results... 13355 1727096157.20383: running TaskExecutor() for managed_node3/TASK: Create test interfaces 13355 1727096157.20387: in run() - task 0afff68d-5257-c514-593f-000000000012 13355 1727096157.20390: variable 'ansible_search_path' from source: unknown 13355 1727096157.20393: variable 'ansible_search_path' from source: unknown 13355 1727096157.20396: calling self._execute() 13355 1727096157.20470: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096157.20481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096157.20495: variable 'omit' from source: magic vars 13355 1727096157.20837: variable 'ansible_distribution_major_version' from source: facts 13355 1727096157.20855: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096157.20866: variable 'omit' from source: magic vars 13355 1727096157.20913: variable 'omit' from source: magic vars 13355 1727096157.21403: variable 'dhcp_interface1' from source: play vars 13355 1727096157.21673: variable 'dhcp_interface2' from source: play vars 13355 1727096157.21677: variable 'omit' from source: magic vars 13355 1727096157.21793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096157.21797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096157.21800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096157.21802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 
1727096157.21839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096157.21907: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096157.22176: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096157.22180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096157.22195: Set connection var ansible_shell_executable to /bin/sh 13355 1727096157.22211: Set connection var ansible_shell_type to sh 13355 1727096157.22224: Set connection var ansible_pipelining to False 13355 1727096157.22235: Set connection var ansible_connection to ssh 13355 1727096157.22246: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096157.22256: Set connection var ansible_timeout to 10 13355 1727096157.22312: variable 'ansible_shell_executable' from source: unknown 13355 1727096157.22402: variable 'ansible_connection' from source: unknown 13355 1727096157.22409: variable 'ansible_module_compression' from source: unknown 13355 1727096157.22417: variable 'ansible_shell_type' from source: unknown 13355 1727096157.22425: variable 'ansible_shell_executable' from source: unknown 13355 1727096157.22432: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096157.22439: variable 'ansible_pipelining' from source: unknown 13355 1727096157.22446: variable 'ansible_timeout' from source: unknown 13355 1727096157.22454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096157.22629: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096157.22642: variable 'omit' from source: 
magic vars 13355 1727096157.22650: starting attempt loop 13355 1727096157.22655: running the handler 13355 1727096157.22668: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096157.22689: _low_level_execute_command(): starting 13355 1727096157.22699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096157.23906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096157.23986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096157.24034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096157.24051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096157.24079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096157.24138: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13355 1727096157.26178: stdout chunk (state=3): >>>/root <<< 13355 1727096157.26184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096157.26187: stdout chunk (state=3): >>><<< 13355 1727096157.26189: stderr chunk (state=3): >>><<< 13355 1727096157.26192: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096157.26195: _low_level_execute_command(): starting 13355 1727096157.26197: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734 `" && echo ansible-tmp-1727096157.2610102-13679-201690896015734="` echo 
/root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734 `" ) && sleep 0' 13355 1727096157.27313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096157.27335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096157.27548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096157.27574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096157.27611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096157.29529: stdout chunk (state=3): >>>ansible-tmp-1727096157.2610102-13679-201690896015734=/root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734 <<< 13355 1727096157.29685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096157.29689: stdout chunk (state=3): >>><<< 13355 1727096157.29692: stderr chunk (state=3): >>><<< 13355 1727096157.29873: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096157.2610102-13679-201690896015734=/root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096157.29876: variable 'ansible_module_compression' from source: unknown 13355 1727096157.29879: ANSIBALLZ: Using generic lock for ansible.legacy.command 13355 1727096157.29881: ANSIBALLZ: Acquiring lock 13355 1727096157.29883: ANSIBALLZ: Lock acquired: 140397099650992 13355 1727096157.29885: ANSIBALLZ: Creating module 13355 1727096157.45597: ANSIBALLZ: Writing module into payload 13355 1727096157.45680: ANSIBALLZ: Writing module 13355 1727096157.45697: ANSIBALLZ: Renaming module 13355 1727096157.45712: ANSIBALLZ: Done creating module 13355 1727096157.45730: variable 'ansible_facts' from source: unknown 13355 1727096157.45854: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/AnsiballZ_command.py 13355 1727096157.45991: Sending initial data 13355 1727096157.45994: Sent initial data (156 bytes) 13355 1727096157.46506: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096157.46517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096157.46526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096157.46540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096157.46551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096157.46562: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096157.46688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096157.46758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096157.46820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096157.48431: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13355 1727096157.48438: stderr chunk (state=3): >>>debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13355 1727096157.48447: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096157.48501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096157.48561: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp32q7tx0r /root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/AnsiballZ_command.py <<< 13355 1727096157.48577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/AnsiballZ_command.py" <<< 13355 1727096157.48635: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp32q7tx0r" to remote "/root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/AnsiballZ_command.py" <<< 13355 1727096157.49470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096157.49649: stderr chunk (state=3): >>><<< 13355 1727096157.49652: stdout chunk (state=3): >>><<< 13355 1727096157.49688: done transferring module to remote 13355 1727096157.49703: _low_level_execute_command(): 
starting 13355 1727096157.49708: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/ /root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/AnsiballZ_command.py && sleep 0' 13355 1727096157.50562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096157.50660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096157.50784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096157.50810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096157.50908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096157.52818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096157.52822: stdout chunk (state=3): >>><<< 13355 1727096157.52824: stderr chunk (state=3): >>><<< 13355 1727096157.52830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096157.52846: _low_level_execute_command(): starting 13355 1727096157.52850: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/AnsiballZ_command.py && sleep 0' 13355 1727096157.53554: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096157.53675: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096157.53704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096157.53783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096158.91997: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr 
--bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-23 08:55:57.689607", "end": "2024-09-23 08:55:58.911582", "delta": "0:00:01.221975", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096158.93575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096158.93581: stdout chunk (state=3): >>><<< 13355 1727096158.93584: stderr chunk (state=3): >>><<< 13355 1727096158.93587: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-23 08:55:57.689607", "end": "2024-09-23 08:55:58.911582", "delta": "0:00:01.221975", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096158.93596: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096158.93599: _low_level_execute_command(): starting 13355 1727096158.93679: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096157.2610102-13679-201690896015734/ > /dev/null 2>&1 && sleep 0' 13355 1727096158.94988: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096158.94994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096158.95193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096158.95197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096158.95228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096158.95266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096158.95384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096158.97261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096158.97278: stdout chunk (state=3): >>><<< 13355 1727096158.97290: stderr chunk (state=3): >>><<< 13355 1727096158.97312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096158.97325: handler run complete 13355 1727096158.97357: Evaluated conditional (False): False 13355 1727096158.97378: attempt loop complete, returning result 13355 1727096158.97386: _execute() done 13355 1727096158.97393: dumping result to json 13355 1727096158.97404: done dumping result, returning 13355 1727096158.97472: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [0afff68d-5257-c514-593f-000000000012] 13355 1727096158.97475: sending task result for task 0afff68d-5257-c514-593f-000000000012 13355 1727096158.97779: done sending task result for task 0afff68d-5257-c514-593f-000000000012 13355 1727096158.97782: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.221975", "end": "2024-09-23 08:55:58.911582", "rc": 0, "start": "2024-09-23 08:55:57.689607" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 13355 1727096158.97877: no more pending results, returning what we have 13355 1727096158.97881: results queue empty 13355 1727096158.97882: checking for any_errors_fatal 13355 1727096158.97890: done checking for any_errors_fatal 13355 1727096158.97891: checking for max_fail_percentage 13355 1727096158.97893: done checking for max_fail_percentage 13355 1727096158.97894: checking to see if all hosts have failed and 
the running result is not ok 13355 1727096158.97894: done checking to see if all hosts have failed 13355 1727096158.97895: getting the remaining hosts for this loop 13355 1727096158.97896: done getting the remaining hosts for this loop 13355 1727096158.97902: getting the next task for host managed_node3 13355 1727096158.97911: done getting next task for host managed_node3 13355 1727096158.97913: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13355 1727096158.97917: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096158.97920: getting variables 13355 1727096158.97922: in VariableManager get_vars() 13355 1727096158.98142: Calling all_inventory to load vars for managed_node3 13355 1727096158.98145: Calling groups_inventory to load vars for managed_node3 13355 1727096158.98148: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096158.98160: Calling all_plugins_play to load vars for managed_node3 13355 1727096158.98163: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096158.98166: Calling groups_plugins_play to load vars for managed_node3 13355 1727096158.98337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096158.98977: done with get_vars() 13355 1727096158.98990: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:58 -0400 (0:00:01.795) 0:00:08.253 ****** 13355 1727096158.99290: entering _queue_task() for managed_node3/include_tasks 13355 1727096158.99725: worker is 1 (out of 1 available) 13355 1727096158.99737: exiting _queue_task() for managed_node3/include_tasks 13355 1727096158.99750: done queuing things up, now waiting for results queue to drain 13355 1727096158.99751: waiting for pending results... 
13355 1727096158.99958: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 13355 1727096159.00274: in run() - task 0afff68d-5257-c514-593f-000000000016 13355 1727096159.00279: variable 'ansible_search_path' from source: unknown 13355 1727096159.00282: variable 'ansible_search_path' from source: unknown 13355 1727096159.00286: calling self._execute() 13355 1727096159.00362: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.00377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.00392: variable 'omit' from source: magic vars 13355 1727096159.00820: variable 'ansible_distribution_major_version' from source: facts 13355 1727096159.00844: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096159.00858: _execute() done 13355 1727096159.00869: dumping result to json 13355 1727096159.00879: done dumping result, returning 13355 1727096159.00891: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-c514-593f-000000000016] 13355 1727096159.00904: sending task result for task 0afff68d-5257-c514-593f-000000000016 13355 1727096159.01023: done sending task result for task 0afff68d-5257-c514-593f-000000000016 13355 1727096159.01080: no more pending results, returning what we have 13355 1727096159.01085: in VariableManager get_vars() 13355 1727096159.01157: Calling all_inventory to load vars for managed_node3 13355 1727096159.01161: Calling groups_inventory to load vars for managed_node3 13355 1727096159.01164: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096159.01180: Calling all_plugins_play to load vars for managed_node3 13355 1727096159.01184: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096159.01187: Calling groups_plugins_play to load vars for managed_node3 13355 1727096159.01665: WORKER PROCESS EXITING 13355 
1727096159.01687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096159.01902: done with get_vars() 13355 1727096159.01911: variable 'ansible_search_path' from source: unknown 13355 1727096159.01912: variable 'ansible_search_path' from source: unknown 13355 1727096159.02018: we have included files to process 13355 1727096159.02019: generating all_blocks data 13355 1727096159.02021: done generating all_blocks data 13355 1727096159.02021: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096159.02023: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096159.02030: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096159.02281: done processing included file 13355 1727096159.02283: iterating over new_blocks loaded from include file 13355 1727096159.02285: in VariableManager get_vars() 13355 1727096159.02312: done with get_vars() 13355 1727096159.02313: filtering new block on tags 13355 1727096159.02330: done filtering new block on tags 13355 1727096159.02333: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 13355 1727096159.02339: extending task lists for all hosts with included blocks 13355 1727096159.02444: done extending task lists 13355 1727096159.02445: done processing included files 13355 1727096159.02446: results queue empty 13355 1727096159.02447: checking for any_errors_fatal 13355 1727096159.02457: done checking for any_errors_fatal 13355 1727096159.02458: checking for max_fail_percentage 13355 1727096159.02459: done checking for 
max_fail_percentage 13355 1727096159.02460: checking to see if all hosts have failed and the running result is not ok 13355 1727096159.02466: done checking to see if all hosts have failed 13355 1727096159.02468: getting the remaining hosts for this loop 13355 1727096159.02470: done getting the remaining hosts for this loop 13355 1727096159.02473: getting the next task for host managed_node3 13355 1727096159.02477: done getting next task for host managed_node3 13355 1727096159.02479: ^ task is: TASK: Get stat for interface {{ interface }} 13355 1727096159.02481: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096159.02484: getting variables 13355 1727096159.02485: in VariableManager get_vars() 13355 1727096159.02505: Calling all_inventory to load vars for managed_node3 13355 1727096159.02507: Calling groups_inventory to load vars for managed_node3 13355 1727096159.02509: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096159.02514: Calling all_plugins_play to load vars for managed_node3 13355 1727096159.02517: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096159.02520: Calling groups_plugins_play to load vars for managed_node3 13355 1727096159.02795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096159.03032: done with get_vars() 13355 1727096159.03041: done getting variables 13355 1727096159.03205: variable 'interface' from source: task vars 13355 1727096159.03211: variable 'dhcp_interface1' from source: play vars 13355 1727096159.03275: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:55:59 -0400 (0:00:00.040) 0:00:08.293 ****** 13355 1727096159.03318: entering _queue_task() for managed_node3/stat 13355 1727096159.03624: worker is 1 (out of 1 available) 13355 1727096159.03748: exiting _queue_task() for managed_node3/stat 13355 1727096159.03761: done queuing things up, now waiting for results queue to drain 13355 1727096159.03762: waiting for pending results... 
13355 1727096159.03916: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 13355 1727096159.04041: in run() - task 0afff68d-5257-c514-593f-000000000248 13355 1727096159.04062: variable 'ansible_search_path' from source: unknown 13355 1727096159.04077: variable 'ansible_search_path' from source: unknown 13355 1727096159.04118: calling self._execute() 13355 1727096159.04210: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.04221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.04236: variable 'omit' from source: magic vars 13355 1727096159.04718: variable 'ansible_distribution_major_version' from source: facts 13355 1727096159.04722: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096159.04724: variable 'omit' from source: magic vars 13355 1727096159.04736: variable 'omit' from source: magic vars 13355 1727096159.04844: variable 'interface' from source: task vars 13355 1727096159.04862: variable 'dhcp_interface1' from source: play vars 13355 1727096159.04933: variable 'dhcp_interface1' from source: play vars 13355 1727096159.04966: variable 'omit' from source: magic vars 13355 1727096159.05011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096159.05071: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096159.05156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096159.05160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096159.05163: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096159.05166: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13355 1727096159.05170: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.05179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.05290: Set connection var ansible_shell_executable to /bin/sh 13355 1727096159.05303: Set connection var ansible_shell_type to sh 13355 1727096159.05313: Set connection var ansible_pipelining to False 13355 1727096159.05323: Set connection var ansible_connection to ssh 13355 1727096159.05333: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096159.05373: Set connection var ansible_timeout to 10 13355 1727096159.05380: variable 'ansible_shell_executable' from source: unknown 13355 1727096159.05393: variable 'ansible_connection' from source: unknown 13355 1727096159.05402: variable 'ansible_module_compression' from source: unknown 13355 1727096159.05409: variable 'ansible_shell_type' from source: unknown 13355 1727096159.05472: variable 'ansible_shell_executable' from source: unknown 13355 1727096159.05673: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.05676: variable 'ansible_pipelining' from source: unknown 13355 1727096159.05678: variable 'ansible_timeout' from source: unknown 13355 1727096159.05680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.05839: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096159.05849: variable 'omit' from source: magic vars 13355 1727096159.05857: starting attempt loop 13355 1727096159.05860: running the handler 13355 1727096159.05875: _low_level_execute_command(): starting 13355 1727096159.05883: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 
1727096159.06776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.06884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.06894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.06924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.08605: stdout chunk (state=3): >>>/root <<< 13355 1727096159.08740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.08765: stdout chunk (state=3): >>><<< 13355 1727096159.08787: stderr chunk (state=3): >>><<< 13355 1727096159.08815: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096159.08837: _low_level_execute_command(): starting 13355 1727096159.08849: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203 `" && echo ansible-tmp-1727096159.0882313-13763-9817169027203="` echo /root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203 `" ) && sleep 0' 13355 1727096159.09481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096159.09497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096159.09522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096159.09544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096159.09562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096159.09641: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.09684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.09698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.09716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.09785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.11785: stdout chunk (state=3): >>>ansible-tmp-1727096159.0882313-13763-9817169027203=/root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203 <<< 13355 1727096159.11946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.11950: stdout chunk (state=3): >>><<< 13355 1727096159.11956: stderr chunk (state=3): >>><<< 13355 1727096159.12174: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096159.0882313-13763-9817169027203=/root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096159.12178: variable 'ansible_module_compression' from source: unknown 13355 1727096159.12181: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13355 1727096159.12183: variable 'ansible_facts' from source: unknown 13355 1727096159.12229: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/AnsiballZ_stat.py 13355 1727096159.12428: Sending initial data 13355 1727096159.12437: Sent initial data (151 bytes) 13355 1727096159.13124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096159.13141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096159.13160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096159.13275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.13305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.13382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.15027: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096159.15090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096159.15161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpyt6r78gh /root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/AnsiballZ_stat.py <<< 13355 1727096159.15181: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/AnsiballZ_stat.py" <<< 13355 1727096159.15204: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpyt6r78gh" to remote "/root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/AnsiballZ_stat.py" <<< 13355 1727096159.16023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.16027: stdout chunk (state=3): >>><<< 13355 1727096159.16029: stderr chunk (state=3): >>><<< 13355 1727096159.16031: done transferring module to remote 13355 1727096159.16034: _low_level_execute_command(): starting 13355 1727096159.16040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/ /root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/AnsiballZ_stat.py && sleep 0' 13355 1727096159.16681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096159.16698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096159.16713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096159.16732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096159.16794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.16861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.16883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.16918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.17003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.18982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.18986: stdout chunk (state=3): >>><<< 13355 1727096159.18989: stderr chunk (state=3): >>><<< 13355 1727096159.19105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096159.19113: _low_level_execute_command(): starting 13355 1727096159.19116: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/AnsiballZ_stat.py && sleep 0' 13355 1727096159.19793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.19833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.19850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.19901: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13355 1727096159.20086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.36454: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28768, "dev": 23, "nlink": 1, "atime": 1727096157.6961663, "mtime": 1727096157.6961663, "ctime": 1727096157.6961663, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13355 1727096159.37611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096159.37616: stdout chunk (state=3): >>><<< 13355 1727096159.37624: stderr chunk (state=3): >>><<< 13355 1727096159.37644: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28768, "dev": 23, "nlink": 1, "atime": 1727096157.6961663, "mtime": 1727096157.6961663, "ctime": 1727096157.6961663, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096159.37703: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096159.37712: _low_level_execute_command(): starting 13355 1727096159.37718: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096159.0882313-13763-9817169027203/ > /dev/null 2>&1 && sleep 0' 13355 1727096159.39083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096159.39088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.39108: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096159.39112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096159.39127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096159.39133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.39292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.39296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.39364: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.39384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.41518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.41581: stderr chunk (state=3): >>><<< 13355 1727096159.41591: stdout chunk (state=3): >>><<< 13355 1727096159.41615: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096159.41630: handler run complete 13355 1727096159.41684: attempt loop complete, returning result 13355 1727096159.41691: _execute() done 13355 1727096159.41698: dumping result to json 13355 1727096159.41707: done dumping result, returning 13355 1727096159.41719: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [0afff68d-5257-c514-593f-000000000248] 13355 1727096159.41732: sending task result for task 0afff68d-5257-c514-593f-000000000248 13355 1727096159.42080: done sending task result for task 0afff68d-5257-c514-593f-000000000248 13355 1727096159.42084: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096157.6961663, "block_size": 4096, "blocks": 0, "ctime": 1727096157.6961663, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28768, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727096157.6961663, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13355 
1727096159.42165: no more pending results, returning what we have 13355 1727096159.42172: results queue empty 13355 1727096159.42173: checking for any_errors_fatal 13355 1727096159.42174: done checking for any_errors_fatal 13355 1727096159.42175: checking for max_fail_percentage 13355 1727096159.42176: done checking for max_fail_percentage 13355 1727096159.42177: checking to see if all hosts have failed and the running result is not ok 13355 1727096159.42178: done checking to see if all hosts have failed 13355 1727096159.42178: getting the remaining hosts for this loop 13355 1727096159.42180: done getting the remaining hosts for this loop 13355 1727096159.42184: getting the next task for host managed_node3 13355 1727096159.42192: done getting next task for host managed_node3 13355 1727096159.42194: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13355 1727096159.42197: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096159.42202: getting variables 13355 1727096159.42204: in VariableManager get_vars() 13355 1727096159.42263: Calling all_inventory to load vars for managed_node3 13355 1727096159.42266: Calling groups_inventory to load vars for managed_node3 13355 1727096159.42276: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096159.42289: Calling all_plugins_play to load vars for managed_node3 13355 1727096159.42291: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096159.42294: Calling groups_plugins_play to load vars for managed_node3 13355 1727096159.42633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096159.42877: done with get_vars() 13355 1727096159.42890: done getting variables 13355 1727096159.42997: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 13355 1727096159.43337: variable 'interface' from source: task vars 13355 1727096159.43342: variable 'dhcp_interface1' from source: play vars 13355 1727096159.43520: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:59 -0400 (0:00:00.402) 0:00:08.696 ****** 13355 1727096159.43554: entering _queue_task() for managed_node3/assert 13355 1727096159.43561: Creating lock for assert 13355 1727096159.44250: worker is 1 (out of 1 available) 13355 1727096159.44263: exiting _queue_task() for managed_node3/assert 13355 1727096159.44314: done queuing things up, now waiting for results queue to drain 13355 
1727096159.44316: waiting for pending results... 13355 1727096159.44545: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 13355 1727096159.44670: in run() - task 0afff68d-5257-c514-593f-000000000017 13355 1727096159.44693: variable 'ansible_search_path' from source: unknown 13355 1727096159.44702: variable 'ansible_search_path' from source: unknown 13355 1727096159.44859: calling self._execute() 13355 1727096159.44863: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.44866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.44876: variable 'omit' from source: magic vars 13355 1727096159.45328: variable 'ansible_distribution_major_version' from source: facts 13355 1727096159.45344: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096159.45356: variable 'omit' from source: magic vars 13355 1727096159.45414: variable 'omit' from source: magic vars 13355 1727096159.45521: variable 'interface' from source: task vars 13355 1727096159.45530: variable 'dhcp_interface1' from source: play vars 13355 1727096159.45593: variable 'dhcp_interface1' from source: play vars 13355 1727096159.45625: variable 'omit' from source: magic vars 13355 1727096159.45666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096159.45705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096159.45732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096159.45754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096159.45773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096159.45808: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096159.45818: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.45826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.45937: Set connection var ansible_shell_executable to /bin/sh 13355 1727096159.46057: Set connection var ansible_shell_type to sh 13355 1727096159.46061: Set connection var ansible_pipelining to False 13355 1727096159.46063: Set connection var ansible_connection to ssh 13355 1727096159.46065: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096159.46070: Set connection var ansible_timeout to 10 13355 1727096159.46072: variable 'ansible_shell_executable' from source: unknown 13355 1727096159.46074: variable 'ansible_connection' from source: unknown 13355 1727096159.46075: variable 'ansible_module_compression' from source: unknown 13355 1727096159.46077: variable 'ansible_shell_type' from source: unknown 13355 1727096159.46079: variable 'ansible_shell_executable' from source: unknown 13355 1727096159.46081: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.46083: variable 'ansible_pipelining' from source: unknown 13355 1727096159.46085: variable 'ansible_timeout' from source: unknown 13355 1727096159.46087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.46275: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096159.46279: variable 'omit' from source: magic vars 13355 1727096159.46281: starting attempt loop 13355 1727096159.46283: running the handler 13355 1727096159.46382: variable 'interface_stat' from source: set_fact 13355 
1727096159.46406: Evaluated conditional (interface_stat.stat.exists): True 13355 1727096159.46422: handler run complete 13355 1727096159.46446: attempt loop complete, returning result 13355 1727096159.46455: _execute() done 13355 1727096159.46491: dumping result to json 13355 1727096159.46494: done dumping result, returning 13355 1727096159.46497: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [0afff68d-5257-c514-593f-000000000017] 13355 1727096159.46499: sending task result for task 0afff68d-5257-c514-593f-000000000017 13355 1727096159.46834: done sending task result for task 0afff68d-5257-c514-593f-000000000017 13355 1727096159.46837: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096159.46890: no more pending results, returning what we have 13355 1727096159.46894: results queue empty 13355 1727096159.46895: checking for any_errors_fatal 13355 1727096159.46905: done checking for any_errors_fatal 13355 1727096159.46906: checking for max_fail_percentage 13355 1727096159.46907: done checking for max_fail_percentage 13355 1727096159.46908: checking to see if all hosts have failed and the running result is not ok 13355 1727096159.46909: done checking to see if all hosts have failed 13355 1727096159.46910: getting the remaining hosts for this loop 13355 1727096159.46911: done getting the remaining hosts for this loop 13355 1727096159.46915: getting the next task for host managed_node3 13355 1727096159.46929: done getting next task for host managed_node3 13355 1727096159.46931: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13355 1727096159.46934: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096159.46939: getting variables 13355 1727096159.46940: in VariableManager get_vars() 13355 1727096159.46997: Calling all_inventory to load vars for managed_node3 13355 1727096159.46999: Calling groups_inventory to load vars for managed_node3 13355 1727096159.47002: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096159.47014: Calling all_plugins_play to load vars for managed_node3 13355 1727096159.47017: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096159.47021: Calling groups_plugins_play to load vars for managed_node3 13355 1727096159.47418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096159.47854: done with get_vars() 13355 1727096159.47866: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:55:59 -0400 (0:00:00.045) 0:00:08.742 ****** 13355 1727096159.48148: entering _queue_task() for managed_node3/include_tasks 13355 1727096159.48727: worker is 1 (out of 1 available) 13355 1727096159.48740: exiting _queue_task() for managed_node3/include_tasks 13355 1727096159.48753: done queuing things up, now waiting for results queue to drain 13355 1727096159.48754: waiting for pending results... 
13355 1727096159.49300: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 13355 1727096159.49414: in run() - task 0afff68d-5257-c514-593f-00000000001b 13355 1727096159.49437: variable 'ansible_search_path' from source: unknown 13355 1727096159.49445: variable 'ansible_search_path' from source: unknown 13355 1727096159.49496: calling self._execute() 13355 1727096159.49594: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.49774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.49777: variable 'omit' from source: magic vars 13355 1727096159.50028: variable 'ansible_distribution_major_version' from source: facts 13355 1727096159.50047: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096159.50062: _execute() done 13355 1727096159.50072: dumping result to json 13355 1727096159.50081: done dumping result, returning 13355 1727096159.50092: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-c514-593f-00000000001b] 13355 1727096159.50101: sending task result for task 0afff68d-5257-c514-593f-00000000001b 13355 1727096159.50305: no more pending results, returning what we have 13355 1727096159.50311: in VariableManager get_vars() 13355 1727096159.50386: Calling all_inventory to load vars for managed_node3 13355 1727096159.50390: Calling groups_inventory to load vars for managed_node3 13355 1727096159.50392: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096159.50407: Calling all_plugins_play to load vars for managed_node3 13355 1727096159.50411: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096159.50414: Calling groups_plugins_play to load vars for managed_node3 13355 1727096159.50887: done sending task result for task 0afff68d-5257-c514-593f-00000000001b 13355 1727096159.50892: WORKER PROCESS EXITING 13355 
1727096159.51031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096159.51542: done with get_vars() 13355 1727096159.51551: variable 'ansible_search_path' from source: unknown 13355 1727096159.51552: variable 'ansible_search_path' from source: unknown 13355 1727096159.51704: we have included files to process 13355 1727096159.51705: generating all_blocks data 13355 1727096159.51707: done generating all_blocks data 13355 1727096159.51710: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096159.51711: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096159.51713: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096159.52131: done processing included file 13355 1727096159.52133: iterating over new_blocks loaded from include file 13355 1727096159.52135: in VariableManager get_vars() 13355 1727096159.52172: done with get_vars() 13355 1727096159.52174: filtering new block on tags 13355 1727096159.52192: done filtering new block on tags 13355 1727096159.52194: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 13355 1727096159.52200: extending task lists for all hosts with included blocks 13355 1727096159.52561: done extending task lists 13355 1727096159.52563: done processing included files 13355 1727096159.52564: results queue empty 13355 1727096159.52564: checking for any_errors_fatal 13355 1727096159.52584: done checking for any_errors_fatal 13355 1727096159.52585: checking for max_fail_percentage 13355 1727096159.52586: done checking for 
max_fail_percentage 13355 1727096159.52587: checking to see if all hosts have failed and the running result is not ok 13355 1727096159.52588: done checking to see if all hosts have failed 13355 1727096159.52589: getting the remaining hosts for this loop 13355 1727096159.52590: done getting the remaining hosts for this loop 13355 1727096159.52593: getting the next task for host managed_node3 13355 1727096159.52597: done getting next task for host managed_node3 13355 1727096159.52600: ^ task is: TASK: Get stat for interface {{ interface }} 13355 1727096159.52602: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096159.52605: getting variables 13355 1727096159.52606: in VariableManager get_vars() 13355 1727096159.52627: Calling all_inventory to load vars for managed_node3 13355 1727096159.52631: Calling groups_inventory to load vars for managed_node3 13355 1727096159.52633: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096159.52653: Calling all_plugins_play to load vars for managed_node3 13355 1727096159.52656: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096159.52660: Calling groups_plugins_play to load vars for managed_node3 13355 1727096159.52832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096159.53034: done with get_vars() 13355 1727096159.53045: done getting variables 13355 1727096159.53225: variable 'interface' from source: task vars 13355 1727096159.53229: variable 'dhcp_interface2' from source: play vars 13355 1727096159.53288: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:55:59 -0400 (0:00:00.051) 0:00:08.793 ****** 13355 1727096159.53326: entering _queue_task() for managed_node3/stat 13355 1727096159.53882: worker is 1 (out of 1 available) 13355 1727096159.53889: exiting _queue_task() for managed_node3/stat 13355 1727096159.53899: done queuing things up, now waiting for results queue to drain 13355 1727096159.53901: waiting for pending results... 
13355 1727096159.54276: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 13355 1727096159.54474: in run() - task 0afff68d-5257-c514-593f-000000000260 13355 1727096159.54478: variable 'ansible_search_path' from source: unknown 13355 1727096159.54482: variable 'ansible_search_path' from source: unknown 13355 1727096159.54484: calling self._execute() 13355 1727096159.54572: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.54585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.54600: variable 'omit' from source: magic vars 13355 1727096159.54975: variable 'ansible_distribution_major_version' from source: facts 13355 1727096159.55075: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096159.55088: variable 'omit' from source: magic vars 13355 1727096159.55145: variable 'omit' from source: magic vars 13355 1727096159.55274: variable 'interface' from source: task vars 13355 1727096159.55381: variable 'dhcp_interface2' from source: play vars 13355 1727096159.55385: variable 'dhcp_interface2' from source: play vars 13355 1727096159.55388: variable 'omit' from source: magic vars 13355 1727096159.55424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096159.55471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096159.55504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096159.55527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096159.55544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096159.55586: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13355 1727096159.55600: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.55609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.55723: Set connection var ansible_shell_executable to /bin/sh 13355 1727096159.55735: Set connection var ansible_shell_type to sh 13355 1727096159.55745: Set connection var ansible_pipelining to False 13355 1727096159.55758: Set connection var ansible_connection to ssh 13355 1727096159.55771: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096159.55782: Set connection var ansible_timeout to 10 13355 1727096159.55895: variable 'ansible_shell_executable' from source: unknown 13355 1727096159.55906: variable 'ansible_connection' from source: unknown 13355 1727096159.55914: variable 'ansible_module_compression' from source: unknown 13355 1727096159.55926: variable 'ansible_shell_type' from source: unknown 13355 1727096159.56033: variable 'ansible_shell_executable' from source: unknown 13355 1727096159.56036: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.56038: variable 'ansible_pipelining' from source: unknown 13355 1727096159.56043: variable 'ansible_timeout' from source: unknown 13355 1727096159.56045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.56576: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096159.56581: variable 'omit' from source: magic vars 13355 1727096159.56583: starting attempt loop 13355 1727096159.56586: running the handler 13355 1727096159.56589: _low_level_execute_command(): starting 13355 1727096159.56592: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 
1727096159.57819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096159.57840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096159.57853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096159.57873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096159.57889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096159.57901: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096159.57918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.57983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.58025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.58050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.58065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.58134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.59858: stdout chunk (state=3): >>>/root <<< 13355 1727096159.59998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.60012: stdout chunk (state=3): >>><<< 
13355 1727096159.60025: stderr chunk (state=3): >>><<< 13355 1727096159.60055: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096159.60081: _low_level_execute_command(): starting 13355 1727096159.60092: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513 `" && echo ansible-tmp-1727096159.6006339-13792-82499068316513="` echo /root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513 `" ) && sleep 0' 13355 1727096159.60751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096159.60782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096159.60799: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096159.60817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096159.60840: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096159.60853: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096159.60946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.60964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.60985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.61009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.61089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.63366: stdout chunk (state=3): >>>ansible-tmp-1727096159.6006339-13792-82499068316513=/root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513 <<< 13355 1727096159.63380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.63394: stdout chunk (state=3): >>><<< 13355 1727096159.63403: stderr chunk (state=3): >>><<< 13355 1727096159.63487: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096159.6006339-13792-82499068316513=/root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096159.63508: variable 'ansible_module_compression' from source: unknown 13355 1727096159.63602: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13355 1727096159.63622: variable 'ansible_facts' from source: unknown 13355 1727096159.63737: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/AnsiballZ_stat.py 13355 1727096159.63956: Sending initial data 13355 1727096159.63959: Sent initial data (152 bytes) 13355 1727096159.64601: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096159.64622: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096159.64640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096159.64716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.64776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.64872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.64876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.66544: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096159.66591: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096159.66648: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpgy0tygfx /root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/AnsiballZ_stat.py <<< 13355 1727096159.66651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/AnsiballZ_stat.py" <<< 13355 1727096159.66696: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpgy0tygfx" to remote "/root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/AnsiballZ_stat.py" <<< 13355 1727096159.67669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.67702: stdout chunk (state=3): >>><<< 13355 1727096159.67706: stderr chunk (state=3): >>><<< 13355 1727096159.67732: done transferring module to remote 13355 1727096159.67811: _low_level_execute_command(): starting 13355 1727096159.67815: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/ /root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/AnsiballZ_stat.py && sleep 0' 13355 1727096159.68415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096159.68486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.68557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.68605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.68680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.70601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.70626: stdout chunk (state=3): >>><<< 13355 1727096159.70631: stderr chunk (state=3): >>><<< 13355 1727096159.70734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096159.70738: _low_level_execute_command(): starting 13355 1727096159.70740: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/AnsiballZ_stat.py && sleep 0' 13355 1727096159.71297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096159.71354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.71373: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096159.71399: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.71486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.87478: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29174, "dev": 23, "nlink": 1, "atime": 1727096157.7005682, "mtime": 1727096157.7005682, "ctime": 1727096157.7005682, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13355 1727096159.88902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096159.88984: stdout chunk (state=3): >>><<< 13355 1727096159.89032: stderr chunk (state=3): >>><<< 13355 1727096159.89060: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29174, "dev": 23, "nlink": 1, "atime": 1727096157.7005682, "mtime": 1727096157.7005682, "ctime": 1727096157.7005682, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096159.89439: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096159.89443: _low_level_execute_command(): starting 13355 1727096159.89445: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096159.6006339-13792-82499068316513/ > /dev/null 2>&1 && sleep 0' 13355 1727096159.90642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096159.90880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096159.91018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096159.91050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096159.93015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096159.93063: stderr chunk (state=3): >>><<< 13355 1727096159.93094: stdout chunk (state=3): >>><<< 13355 1727096159.93291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096159.93295: handler run complete 13355 1727096159.93299: attempt loop complete, returning result 13355 1727096159.93301: _execute() done 13355 1727096159.93303: dumping result to json 13355 1727096159.93306: done dumping result, returning 13355 1727096159.93308: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [0afff68d-5257-c514-593f-000000000260] 13355 1727096159.93310: sending task result for task 0afff68d-5257-c514-593f-000000000260 ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096157.7005682, "block_size": 4096, "blocks": 0, "ctime": 1727096157.7005682, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29174, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727096157.7005682, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13355 1727096159.93835: no more pending results, returning what we have 13355 1727096159.93838: results queue empty 13355 1727096159.93839: checking for any_errors_fatal 13355 1727096159.93840: done checking for any_errors_fatal 13355 1727096159.93841: checking for max_fail_percentage 13355 1727096159.93842: done checking for max_fail_percentage 13355 1727096159.93843: checking to see if all hosts have failed and the running result is not ok 13355 1727096159.93844: done 
checking to see if all hosts have failed 13355 1727096159.93844: getting the remaining hosts for this loop 13355 1727096159.93846: done getting the remaining hosts for this loop 13355 1727096159.93849: getting the next task for host managed_node3 13355 1727096159.93862: done getting next task for host managed_node3 13355 1727096159.93865: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13355 1727096159.93868: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096159.93873: getting variables 13355 1727096159.93874: in VariableManager get_vars() 13355 1727096159.93927: Calling all_inventory to load vars for managed_node3 13355 1727096159.93930: Calling groups_inventory to load vars for managed_node3 13355 1727096159.93932: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096159.93944: Calling all_plugins_play to load vars for managed_node3 13355 1727096159.93946: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096159.93949: Calling groups_plugins_play to load vars for managed_node3 13355 1727096159.94512: done sending task result for task 0afff68d-5257-c514-593f-000000000260 13355 1727096159.94516: WORKER PROCESS EXITING 13355 1727096159.94793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096159.95060: done with get_vars() 13355 1727096159.95073: done getting variables 13355 1727096159.95129: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096159.95258: variable 'interface' from source: task vars 13355 1727096159.95263: variable 'dhcp_interface2' from source: play vars 13355 1727096159.95326: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:55:59 -0400 (0:00:00.420) 0:00:09.214 ****** 13355 1727096159.95360: entering _queue_task() for managed_node3/assert 13355 1727096159.95668: worker is 1 (out of 1 available) 13355 1727096159.95681: exiting _queue_task() for managed_node3/assert 13355 
1727096159.95694: done queuing things up, now waiting for results queue to drain 13355 1727096159.95695: waiting for pending results... 13355 1727096159.96161: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 13355 1727096159.96271: in run() - task 0afff68d-5257-c514-593f-00000000001c 13355 1727096159.96296: variable 'ansible_search_path' from source: unknown 13355 1727096159.96303: variable 'ansible_search_path' from source: unknown 13355 1727096159.96444: calling self._execute() 13355 1727096159.96565: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.96582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.96598: variable 'omit' from source: magic vars 13355 1727096159.97035: variable 'ansible_distribution_major_version' from source: facts 13355 1727096159.97107: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096159.97115: variable 'omit' from source: magic vars 13355 1727096159.97135: variable 'omit' from source: magic vars 13355 1727096159.97262: variable 'interface' from source: task vars 13355 1727096159.97274: variable 'dhcp_interface2' from source: play vars 13355 1727096159.97358: variable 'dhcp_interface2' from source: play vars 13355 1727096159.97436: variable 'omit' from source: magic vars 13355 1727096159.97441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096159.97490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096159.97545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096159.97549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096159.97570: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096159.97606: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096159.97616: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.97657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.97744: Set connection var ansible_shell_executable to /bin/sh 13355 1727096159.97764: Set connection var ansible_shell_type to sh 13355 1727096159.97784: Set connection var ansible_pipelining to False 13355 1727096159.97794: Set connection var ansible_connection to ssh 13355 1727096159.97906: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096159.97910: Set connection var ansible_timeout to 10 13355 1727096159.97913: variable 'ansible_shell_executable' from source: unknown 13355 1727096159.97915: variable 'ansible_connection' from source: unknown 13355 1727096159.97917: variable 'ansible_module_compression' from source: unknown 13355 1727096159.97919: variable 'ansible_shell_type' from source: unknown 13355 1727096159.97921: variable 'ansible_shell_executable' from source: unknown 13355 1727096159.97929: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096159.97935: variable 'ansible_pipelining' from source: unknown 13355 1727096159.97974: variable 'ansible_timeout' from source: unknown 13355 1727096159.97987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096159.98199: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096159.98233: variable 'omit' from source: magic vars 13355 1727096159.98236: starting 
attempt loop 13355 1727096159.98238: running the handler 13355 1727096159.98396: variable 'interface_stat' from source: set_fact 13355 1727096159.98426: Evaluated conditional (interface_stat.stat.exists): True 13355 1727096159.98450: handler run complete 13355 1727096159.98526: attempt loop complete, returning result 13355 1727096159.98529: _execute() done 13355 1727096159.98531: dumping result to json 13355 1727096159.98534: done dumping result, returning 13355 1727096159.98536: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [0afff68d-5257-c514-593f-00000000001c] 13355 1727096159.98537: sending task result for task 0afff68d-5257-c514-593f-00000000001c 13355 1727096159.98612: done sending task result for task 0afff68d-5257-c514-593f-00000000001c 13355 1727096159.98615: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096159.98681: no more pending results, returning what we have 13355 1727096159.98685: results queue empty 13355 1727096159.98685: checking for any_errors_fatal 13355 1727096159.98693: done checking for any_errors_fatal 13355 1727096159.98694: checking for max_fail_percentage 13355 1727096159.98695: done checking for max_fail_percentage 13355 1727096159.98696: checking to see if all hosts have failed and the running result is not ok 13355 1727096159.98697: done checking to see if all hosts have failed 13355 1727096159.98698: getting the remaining hosts for this loop 13355 1727096159.98699: done getting the remaining hosts for this loop 13355 1727096159.98702: getting the next task for host managed_node3 13355 1727096159.98710: done getting next task for host managed_node3 13355 1727096159.98713: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 13355 1727096159.98715: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096159.98718: getting variables 13355 1727096159.98720: in VariableManager get_vars() 13355 1727096159.98784: Calling all_inventory to load vars for managed_node3 13355 1727096159.98787: Calling groups_inventory to load vars for managed_node3 13355 1727096159.98790: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096159.98801: Calling all_plugins_play to load vars for managed_node3 13355 1727096159.98805: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096159.98807: Calling groups_plugins_play to load vars for managed_node3 13355 1727096159.99319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096159.99551: done with get_vars() 13355 1727096159.99564: done getting variables 13355 1727096159.99660: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:28 Monday 23 September 2024 08:55:59 -0400 (0:00:00.043) 0:00:09.257 ****** 13355 1727096159.99734: entering _queue_task() for managed_node3/command 13355 1727096160.00043: worker is 1 (out of 1 available) 13355 1727096160.00062: exiting _queue_task() for managed_node3/command 13355 1727096160.00081: done queuing things up, now waiting for results queue to drain 13355 1727096160.00082: waiting for pending results... 
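The two preceding tasks boil down to a simple check: the `stat` module (invoked with `follow: false`) reports `islnk`, `mode`, and `lnk_target` for `/sys/class/net/test2`, and the assert task then evaluates `interface_stat.stat.exists`. A minimal sketch of that flag derivation, using a temporary symlink in place of the sysfs entry (the helper name `stat_flags` is illustrative, not an Ansible internal):

```python
import os
import stat
import tempfile

def stat_flags(path):
    """Derive the subset of fields the log's stat result reports
    (exists, islnk, isreg, mode, lnk_target) without following symlinks."""
    st = os.lstat(path)  # lstat because the module ran with follow=False
    return {
        "exists": True,
        "islnk": stat.S_ISLNK(st.st_mode),
        "isreg": stat.S_ISREG(st.st_mode),
        "mode": "%04o" % stat.S_IMODE(st.st_mode),
        "lnk_target": os.readlink(path) if stat.S_ISLNK(st.st_mode) else None,
    }

# Recreate the shape seen in the log: a symlink with a relative target,
# like /sys/class/net/test2 -> ../../devices/virtual/net/test2.
with tempfile.TemporaryDirectory() as d:
    link = os.path.join(d, "test2")
    os.symlink("../../devices/virtual/net/test2", link)
    result = stat_flags(link)
    # The assert task's condition (interface_stat.stat.exists) reduces to:
    assert result["exists"] and result["islnk"]
```

On Linux, symlinks report permission bits `0777`, matching `"mode": "0777"` in the result above.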
13355 1727096160.00485: running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript 13355 1727096160.00491: in run() - task 0afff68d-5257-c514-593f-00000000001d 13355 1727096160.00494: variable 'ansible_search_path' from source: unknown 13355 1727096160.00496: calling self._execute() 13355 1727096160.00573: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.00577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.00611: variable 'omit' from source: magic vars 13355 1727096160.01333: variable 'ansible_distribution_major_version' from source: facts 13355 1727096160.01349: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096160.01453: variable 'network_provider' from source: set_fact 13355 1727096160.01473: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096160.01476: when evaluation is False, skipping this task 13355 1727096160.01478: _execute() done 13355 1727096160.01480: dumping result to json 13355 1727096160.01672: done dumping result, returning 13355 1727096160.01676: done running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript [0afff68d-5257-c514-593f-00000000001d] 13355 1727096160.01678: sending task result for task 0afff68d-5257-c514-593f-00000000001d 13355 1727096160.01751: done sending task result for task 0afff68d-5257-c514-593f-00000000001d 13355 1727096160.01754: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096160.01802: no more pending results, returning what we have 13355 1727096160.01807: results queue empty 13355 1727096160.01808: checking for any_errors_fatal 13355 1727096160.01817: done checking for any_errors_fatal 13355 1727096160.01818: checking for max_fail_percentage 13355 1727096160.01820: done checking 
for max_fail_percentage 13355 1727096160.01821: checking to see if all hosts have failed and the running result is not ok 13355 1727096160.01822: done checking to see if all hosts have failed 13355 1727096160.01823: getting the remaining hosts for this loop 13355 1727096160.01824: done getting the remaining hosts for this loop 13355 1727096160.01829: getting the next task for host managed_node3 13355 1727096160.01839: done getting next task for host managed_node3 13355 1727096160.01842: ^ task is: TASK: TEST Add Bond with 2 ports 13355 1727096160.01845: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096160.01848: getting variables 13355 1727096160.01850: in VariableManager get_vars() 13355 1727096160.01908: Calling all_inventory to load vars for managed_node3 13355 1727096160.01911: Calling groups_inventory to load vars for managed_node3 13355 1727096160.02334: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096160.02344: Calling all_plugins_play to load vars for managed_node3 13355 1727096160.02347: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096160.02350: Calling groups_plugins_play to load vars for managed_node3 13355 1727096160.02498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096160.02691: done with get_vars() 13355 1727096160.02703: done getting variables 13355 1727096160.02759: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:33 Monday 23 September 2024 08:56:00 -0400 (0:00:00.030) 0:00:09.288 ****** 13355 1727096160.02788: entering _queue_task() for managed_node3/debug 13355 1727096160.03403: worker is 1 (out of 1 available) 13355 1727096160.03414: exiting _queue_task() for managed_node3/debug 13355 1727096160.03426: done queuing things up, now waiting for results queue to drain 13355 1727096160.03427: waiting for pending results... 13355 1727096160.03505: running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports 13355 1727096160.03876: in run() - task 0afff68d-5257-c514-593f-00000000001e 13355 1727096160.03984: variable 'ansible_search_path' from source: unknown 13355 1727096160.03988: calling self._execute() 13355 1727096160.04042: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.04101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.04118: variable 'omit' from source: magic vars 13355 1727096160.04646: variable 'ansible_distribution_major_version' from source: facts 13355 1727096160.04667: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096160.04682: variable 'omit' from source: magic vars 13355 1727096160.04709: variable 'omit' from source: magic vars 13355 1727096160.04757: variable 'omit' from source: magic vars 13355 1727096160.04806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096160.04851: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096160.04884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096160.04907: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096160.04926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096160.04966: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096160.04981: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.04990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.05100: Set connection var ansible_shell_executable to /bin/sh 13355 1727096160.05112: Set connection var ansible_shell_type to sh 13355 1727096160.05122: Set connection var ansible_pipelining to False 13355 1727096160.05131: Set connection var ansible_connection to ssh 13355 1727096160.05141: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096160.05150: Set connection var ansible_timeout to 10 13355 1727096160.05182: variable 'ansible_shell_executable' from source: unknown 13355 1727096160.05197: variable 'ansible_connection' from source: unknown 13355 1727096160.05205: variable 'ansible_module_compression' from source: unknown 13355 1727096160.05213: variable 'ansible_shell_type' from source: unknown 13355 1727096160.05220: variable 'ansible_shell_executable' from source: unknown 13355 1727096160.05226: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.05234: variable 'ansible_pipelining' from source: unknown 13355 1727096160.05300: variable 'ansible_timeout' from source: unknown 13355 1727096160.05304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.05399: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096160.05422: variable 'omit' from source: magic vars 13355 1727096160.05433: starting attempt loop 13355 1727096160.05440: running the handler 13355 1727096160.05495: handler run complete 13355 1727096160.05521: attempt loop complete, returning result 13355 1727096160.05528: _execute() done 13355 1727096160.05535: dumping result to json 13355 1727096160.05543: done dumping result, returning 13355 1727096160.05556: done running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports [0afff68d-5257-c514-593f-00000000001e] 13355 1727096160.05569: sending task result for task 0afff68d-5257-c514-593f-00000000001e 13355 1727096160.05700: done sending task result for task 0afff68d-5257-c514-593f-00000000001e 13355 1727096160.05703: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 13355 1727096160.05785: no more pending results, returning what we have 13355 1727096160.05788: results queue empty 13355 1727096160.05789: checking for any_errors_fatal 13355 1727096160.05799: done checking for any_errors_fatal 13355 1727096160.05799: checking for max_fail_percentage 13355 1727096160.05801: done checking for max_fail_percentage 13355 1727096160.05802: checking to see if all hosts have failed and the running result is not ok 13355 1727096160.05803: done checking to see if all hosts have failed 13355 1727096160.05803: getting the remaining hosts for this loop 13355 1727096160.05805: done getting the remaining hosts for this loop 13355 1727096160.05809: getting the next task for host managed_node3 13355 1727096160.05817: done getting next task for host managed_node3 13355 1727096160.05822: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096160.05826: ^ state is: HOST STATE: block=2, 
task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096160.05842: getting variables 13355 1727096160.05844: in VariableManager get_vars() 13355 1727096160.05907: Calling all_inventory to load vars for managed_node3 13355 1727096160.05910: Calling groups_inventory to load vars for managed_node3 13355 1727096160.05913: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096160.05925: Calling all_plugins_play to load vars for managed_node3 13355 1727096160.05928: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096160.05930: Calling groups_plugins_play to load vars for managed_node3 13355 1727096160.06224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096160.06993: done with get_vars() 13355 1727096160.07005: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:56:00 -0400 (0:00:00.043) 0:00:09.331 ****** 13355 1727096160.07124: entering _queue_task() for managed_node3/include_tasks 13355 1727096160.07727: worker is 1 (out of 1 available) 13355 1727096160.07961: exiting _queue_task() for managed_node3/include_tasks 13355 1727096160.08077: done queuing things up, now waiting for results queue to drain 13355 1727096160.08078: waiting for pending results... 
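The skipped resolv.conf backup task above shows how a false `when:` evaluation short-circuits execution and emits a `skipping:` payload instead of running the handler. A minimal sketch of that control flow; the function names are illustrative and `eval` stands in for Ansible's Jinja2 templating, but the returned dict mirrors the `skipping:` output in the log:

```python
def evaluate_when(condition, variables):
    """Evaluate a when:-style expression against task variables.
    Illustration only; Ansible renders the expression with Jinja2."""
    return bool(eval(condition, {"__builtins__": {}}, dict(variables)))

def run_task(condition, variables, handler):
    """Run handler() only if the condition holds; otherwise return
    the skip result shape shown in the log."""
    if not evaluate_when(condition, variables):
        return {
            "changed": False,
            "false_condition": condition,
            "skip_reason": "Conditional result was False",
        }
    return handler()

# network_provider was set to something other than "initscripts",
# so the backup task is skipped.
result = run_task('network_provider == "initscripts"',
                  {"network_provider": "nm"},
                  lambda: {"changed": True})
assert result["skip_reason"] == "Conditional result was False"
```

With `network_provider` set to `"initscripts"` instead, the same call would execute the handler and return `{"changed": True}`.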
13355 1727096160.08512: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096160.08809: in run() - task 0afff68d-5257-c514-593f-000000000026 13355 1727096160.08813: variable 'ansible_search_path' from source: unknown 13355 1727096160.08815: variable 'ansible_search_path' from source: unknown 13355 1727096160.08974: calling self._execute() 13355 1727096160.09245: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.09249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.09251: variable 'omit' from source: magic vars 13355 1727096160.09740: variable 'ansible_distribution_major_version' from source: facts 13355 1727096160.09763: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096160.09779: _execute() done 13355 1727096160.09791: dumping result to json 13355 1727096160.09800: done dumping result, returning 13355 1727096160.09813: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-c514-593f-000000000026] 13355 1727096160.09825: sending task result for task 0afff68d-5257-c514-593f-000000000026 13355 1727096160.10059: no more pending results, returning what we have 13355 1727096160.10064: in VariableManager get_vars() 13355 1727096160.10137: Calling all_inventory to load vars for managed_node3 13355 1727096160.10140: Calling groups_inventory to load vars for managed_node3 13355 1727096160.10143: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096160.10160: Calling all_plugins_play to load vars for managed_node3 13355 1727096160.10165: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096160.10170: Calling groups_plugins_play to load vars for managed_node3 13355 1727096160.10587: done sending task result for task 0afff68d-5257-c514-593f-000000000026 13355 
1727096160.10590: WORKER PROCESS EXITING 13355 1727096160.10617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096160.10828: done with get_vars() 13355 1727096160.10840: variable 'ansible_search_path' from source: unknown 13355 1727096160.10841: variable 'ansible_search_path' from source: unknown 13355 1727096160.10890: we have included files to process 13355 1727096160.10891: generating all_blocks data 13355 1727096160.10893: done generating all_blocks data 13355 1727096160.10898: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096160.10900: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096160.10902: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096160.11669: done processing included file 13355 1727096160.11672: iterating over new_blocks loaded from include file 13355 1727096160.11674: in VariableManager get_vars() 13355 1727096160.11708: done with get_vars() 13355 1727096160.11710: filtering new block on tags 13355 1727096160.11729: done filtering new block on tags 13355 1727096160.11732: in VariableManager get_vars() 13355 1727096160.11771: done with get_vars() 13355 1727096160.11773: filtering new block on tags 13355 1727096160.11793: done filtering new block on tags 13355 1727096160.11796: in VariableManager get_vars() 13355 1727096160.11825: done with get_vars() 13355 1727096160.11827: filtering new block on tags 13355 1727096160.11845: done filtering new block on tags 13355 1727096160.11847: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13355 1727096160.11855: extending task lists for all hosts 
with included blocks 13355 1727096160.13374: done extending task lists 13355 1727096160.13376: done processing included files 13355 1727096160.13377: results queue empty 13355 1727096160.13378: checking for any_errors_fatal 13355 1727096160.13381: done checking for any_errors_fatal 13355 1727096160.13381: checking for max_fail_percentage 13355 1727096160.13382: done checking for max_fail_percentage 13355 1727096160.13383: checking to see if all hosts have failed and the running result is not ok 13355 1727096160.13384: done checking to see if all hosts have failed 13355 1727096160.13385: getting the remaining hosts for this loop 13355 1727096160.13386: done getting the remaining hosts for this loop 13355 1727096160.13389: getting the next task for host managed_node3 13355 1727096160.13392: done getting next task for host managed_node3 13355 1727096160.13395: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096160.13398: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096160.13408: getting variables 13355 1727096160.13409: in VariableManager get_vars() 13355 1727096160.13435: Calling all_inventory to load vars for managed_node3 13355 1727096160.13438: Calling groups_inventory to load vars for managed_node3 13355 1727096160.13439: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096160.13446: Calling all_plugins_play to load vars for managed_node3 13355 1727096160.13448: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096160.13454: Calling groups_plugins_play to load vars for managed_node3 13355 1727096160.13619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096160.13816: done with get_vars() 13355 1727096160.13829: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:56:00 -0400 (0:00:00.067) 0:00:09.399 ****** 13355 1727096160.13916: entering _queue_task() for managed_node3/setup 13355 1727096160.14251: worker is 1 (out of 1 available) 13355 1727096160.14471: exiting _queue_task() for managed_node3/setup 13355 1727096160.14483: done queuing things up, now waiting for results queue to drain 13355 1727096160.14484: waiting for pending results... 
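The repeated "filtering new block on tags" steps above prune tasks from each block loaded out of `set_facts.yml` before they are queued. A rough, hypothetical sketch of that pruning (the real logic lives inside Ansible's block handling; the helper and task dicts here are invented):

```python
# Hypothetical sketch of tag filtering on an included block's tasks.
def filter_on_tags(tasks, run_tags):
    # With no --tags requested, keep everything; otherwise keep tasks that
    # carry a matching tag, plus tasks tagged 'always' (Ansible's semantics).
    if not run_tags:
        return list(tasks)
    return [t for t in tasks
            if "always" in t.get("tags", [])
            or set(t.get("tags", [])) & set(run_tags)]

tasks = [
    {"name": "gather", "tags": ["setup"]},
    {"name": "configure", "tags": ["network"]},
    {"name": "report", "tags": ["always"]},
]
print([t["name"] for t in filter_on_tags(tasks, ["network"])])
# -> ['configure', 'report']
```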
13355 1727096160.14556: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096160.14729: in run() - task 0afff68d-5257-c514-593f-00000000027e 13355 1727096160.14751: variable 'ansible_search_path' from source: unknown 13355 1727096160.14763: variable 'ansible_search_path' from source: unknown 13355 1727096160.14806: calling self._execute() 13355 1727096160.14907: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.14919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.14941: variable 'omit' from source: magic vars 13355 1727096160.15316: variable 'ansible_distribution_major_version' from source: facts 13355 1727096160.15333: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096160.15548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096160.17780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096160.17877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096160.17980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096160.17984: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096160.17999: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096160.18086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096160.18125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096160.18158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096160.18273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096160.18276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096160.18292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096160.18327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096160.18360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096160.18405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096160.18432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096160.18590: variable '__network_required_facts' from source: role 
'' defaults 13355 1727096160.18603: variable 'ansible_facts' from source: unknown 13355 1727096160.18702: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13355 1727096160.18709: when evaluation is False, skipping this task 13355 1727096160.18715: _execute() done 13355 1727096160.18743: dumping result to json 13355 1727096160.18746: done dumping result, returning 13355 1727096160.18749: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-c514-593f-00000000027e] 13355 1727096160.18751: sending task result for task 0afff68d-5257-c514-593f-00000000027e skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096160.19016: no more pending results, returning what we have 13355 1727096160.19020: results queue empty 13355 1727096160.19020: checking for any_errors_fatal 13355 1727096160.19022: done checking for any_errors_fatal 13355 1727096160.19022: checking for max_fail_percentage 13355 1727096160.19024: done checking for max_fail_percentage 13355 1727096160.19024: checking to see if all hosts have failed and the running result is not ok 13355 1727096160.19025: done checking to see if all hosts have failed 13355 1727096160.19026: getting the remaining hosts for this loop 13355 1727096160.19027: done getting the remaining hosts for this loop 13355 1727096160.19031: getting the next task for host managed_node3 13355 1727096160.19039: done getting next task for host managed_node3 13355 1727096160.19043: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096160.19047: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096160.19063: getting variables 13355 1727096160.19064: in VariableManager get_vars() 13355 1727096160.19119: Calling all_inventory to load vars for managed_node3 13355 1727096160.19121: Calling groups_inventory to load vars for managed_node3 13355 1727096160.19124: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096160.19135: Calling all_plugins_play to load vars for managed_node3 13355 1727096160.19138: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096160.19141: Calling groups_plugins_play to load vars for managed_node3 13355 1727096160.19503: done sending task result for task 0afff68d-5257-c514-593f-00000000027e 13355 1727096160.19507: WORKER PROCESS EXITING 13355 1727096160.19530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096160.19739: done with get_vars() 13355 1727096160.19755: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:56:00 -0400 (0:00:00.059) 0:00:09.459 ****** 13355 1727096160.19866: entering _queue_task() for managed_node3/stat 13355 1727096160.20159: worker is 
1 (out of 1 available) 13355 1727096160.20275: exiting _queue_task() for managed_node3/stat 13355 1727096160.20286: done queuing things up, now waiting for results queue to drain 13355 1727096160.20287: waiting for pending results... 13355 1727096160.20443: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096160.20605: in run() - task 0afff68d-5257-c514-593f-000000000280 13355 1727096160.20631: variable 'ansible_search_path' from source: unknown 13355 1727096160.20640: variable 'ansible_search_path' from source: unknown 13355 1727096160.20685: calling self._execute() 13355 1727096160.20778: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.20789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.20801: variable 'omit' from source: magic vars 13355 1727096160.21172: variable 'ansible_distribution_major_version' from source: facts 13355 1727096160.21190: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096160.21349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096160.21709: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096160.21764: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096160.21810: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096160.21855: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096160.21951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096160.21989: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096160.22025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096160.22061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096160.22165: variable '__network_is_ostree' from source: set_fact 13355 1727096160.22181: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096160.22189: when evaluation is False, skipping this task 13355 1727096160.22196: _execute() done 13355 1727096160.22227: dumping result to json 13355 1727096160.22231: done dumping result, returning 13355 1727096160.22234: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-c514-593f-000000000280] 13355 1727096160.22236: sending task result for task 0afff68d-5257-c514-593f-000000000280 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096160.22396: no more pending results, returning what we have 13355 1727096160.22400: results queue empty 13355 1727096160.22401: checking for any_errors_fatal 13355 1727096160.22407: done checking for any_errors_fatal 13355 1727096160.22408: checking for max_fail_percentage 13355 1727096160.22410: done checking for max_fail_percentage 13355 1727096160.22410: checking to see if all hosts have failed and the running result is not ok 13355 1727096160.22411: done checking to see if all hosts have failed 13355 1727096160.22412: getting the remaining hosts for this loop 13355 
1727096160.22413: done getting the remaining hosts for this loop 13355 1727096160.22417: getting the next task for host managed_node3 13355 1727096160.22424: done getting next task for host managed_node3 13355 1727096160.22428: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096160.22432: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096160.22446: getting variables 13355 1727096160.22448: in VariableManager get_vars() 13355 1727096160.22511: Calling all_inventory to load vars for managed_node3 13355 1727096160.22514: Calling groups_inventory to load vars for managed_node3 13355 1727096160.22517: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096160.22528: Calling all_plugins_play to load vars for managed_node3 13355 1727096160.22531: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096160.22534: Calling groups_plugins_play to load vars for managed_node3 13355 1727096160.23063: done sending task result for task 0afff68d-5257-c514-593f-000000000280 13355 1727096160.23067: WORKER PROCESS EXITING 13355 1727096160.23092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096160.23302: done with get_vars() 13355 1727096160.23312: done getting variables 13355 1727096160.23374: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:56:00 -0400 (0:00:00.035) 0:00:09.494 ****** 13355 1727096160.23409: entering _queue_task() for managed_node3/set_fact 13355 1727096160.23887: worker is 1 (out of 1 available) 13355 1727096160.23896: exiting _queue_task() for managed_node3/set_fact 13355 1727096160.23908: done queuing things up, now waiting for results queue to drain 13355 1727096160.23909: waiting for pending results... 
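The conditional evaluated above, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, runs the setup task only when some required fact is still missing; here it came back False, so the task was skipped. The same check in plain Python (the fact names below are illustrative, not the role's actual required list):

```python
# Mirrors the Jinja2 condition:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
def needs_fact_gathering(required_facts, ansible_facts):
    missing = set(required_facts) - set(ansible_facts.keys())
    return len(missing) > 0

required = ["distribution", "os_family"]  # illustrative fact names
facts = {"distribution": "Fedora", "os_family": "RedHat", "kernel": "6.x"}
print(needs_fact_gathering(required, facts))  # False -> setup task is skipped
```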
13355 1727096160.23984: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096160.24136: in run() - task 0afff68d-5257-c514-593f-000000000281 13355 1727096160.24155: variable 'ansible_search_path' from source: unknown 13355 1727096160.24163: variable 'ansible_search_path' from source: unknown 13355 1727096160.24200: calling self._execute() 13355 1727096160.24285: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.24295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.24306: variable 'omit' from source: magic vars 13355 1727096160.24666: variable 'ansible_distribution_major_version' from source: facts 13355 1727096160.24690: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096160.24860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096160.25145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096160.25197: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096160.25239: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096160.25329: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096160.25373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096160.25405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096160.25441: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096160.25478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096160.25572: variable '__network_is_ostree' from source: set_fact 13355 1727096160.25585: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096160.25593: when evaluation is False, skipping this task 13355 1727096160.25600: _execute() done 13355 1727096160.25658: dumping result to json 13355 1727096160.25661: done dumping result, returning 13355 1727096160.25664: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-c514-593f-000000000281] 13355 1727096160.25666: sending task result for task 0afff68d-5257-c514-593f-000000000281 13355 1727096160.25733: done sending task result for task 0afff68d-5257-c514-593f-000000000281 13355 1727096160.25736: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096160.25810: no more pending results, returning what we have 13355 1727096160.25814: results queue empty 13355 1727096160.25815: checking for any_errors_fatal 13355 1727096160.25821: done checking for any_errors_fatal 13355 1727096160.25822: checking for max_fail_percentage 13355 1727096160.25824: done checking for max_fail_percentage 13355 1727096160.25824: checking to see if all hosts have failed and the running result is not ok 13355 1727096160.25825: done checking to see if all hosts have failed 13355 1727096160.25826: getting the remaining hosts for this loop 13355 1727096160.25827: done getting the remaining hosts for this loop 
13355 1727096160.25832: getting the next task for host managed_node3 13355 1727096160.25841: done getting next task for host managed_node3 13355 1727096160.25845: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096160.25850: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096160.25865: getting variables 13355 1727096160.25869: in VariableManager get_vars() 13355 1727096160.25919: Calling all_inventory to load vars for managed_node3 13355 1727096160.25922: Calling groups_inventory to load vars for managed_node3 13355 1727096160.25924: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096160.25933: Calling all_plugins_play to load vars for managed_node3 13355 1727096160.25936: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096160.25938: Calling groups_plugins_play to load vars for managed_node3 13355 1727096160.26345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096160.26565: done with get_vars() 13355 1727096160.26578: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:56:00 -0400 (0:00:00.032) 0:00:09.527 ****** 13355 1727096160.26673: entering _queue_task() for managed_node3/service_facts 13355 1727096160.26675: Creating lock for service_facts 13355 1727096160.26936: worker is 1 (out of 1 available) 13355 1727096160.26948: exiting _queue_task() for managed_node3/service_facts 13355 1727096160.26964: done queuing things up, now waiting for results queue to drain 13355 1727096160.26965: waiting for pending results... 
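Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") are guarded by `when: not __network_is_ostree is defined`, and both report `false_condition` because the fact was already set earlier, so the check runs only once per host. A minimal sketch of that run-once guard (the variable store is invented):

```python
# Hypothetical sketch of a run-once guard:
#   when: not __network_is_ostree is defined
def should_run(fact_name, host_vars):
    # Run only while the fact has never been set for this host.
    return fact_name not in host_vars

host_vars = {"__network_is_ostree": False}  # set_fact from an earlier pass
print(should_run("__network_is_ostree", host_vars))  # False -> task skipped
```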
13355 1727096160.27294: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096160.27500: in run() - task 0afff68d-5257-c514-593f-000000000283 13355 1727096160.27504: variable 'ansible_search_path' from source: unknown 13355 1727096160.27507: variable 'ansible_search_path' from source: unknown 13355 1727096160.27510: calling self._execute() 13355 1727096160.27561: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.27575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.27591: variable 'omit' from source: magic vars 13355 1727096160.27983: variable 'ansible_distribution_major_version' from source: facts 13355 1727096160.28001: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096160.28012: variable 'omit' from source: magic vars 13355 1727096160.28097: variable 'omit' from source: magic vars 13355 1727096160.28137: variable 'omit' from source: magic vars 13355 1727096160.28189: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096160.28232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096160.28265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096160.28290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096160.28308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096160.28343: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096160.28366: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.28371: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13355 1727096160.28475: Set connection var ansible_shell_executable to /bin/sh 13355 1727096160.28573: Set connection var ansible_shell_type to sh 13355 1727096160.28577: Set connection var ansible_pipelining to False 13355 1727096160.28580: Set connection var ansible_connection to ssh 13355 1727096160.28582: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096160.28584: Set connection var ansible_timeout to 10 13355 1727096160.28586: variable 'ansible_shell_executable' from source: unknown 13355 1727096160.28589: variable 'ansible_connection' from source: unknown 13355 1727096160.28591: variable 'ansible_module_compression' from source: unknown 13355 1727096160.28593: variable 'ansible_shell_type' from source: unknown 13355 1727096160.28595: variable 'ansible_shell_executable' from source: unknown 13355 1727096160.28597: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096160.28599: variable 'ansible_pipelining' from source: unknown 13355 1727096160.28601: variable 'ansible_timeout' from source: unknown 13355 1727096160.28603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096160.28827: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096160.28832: variable 'omit' from source: magic vars 13355 1727096160.28834: starting attempt loop 13355 1727096160.28836: running the handler 13355 1727096160.28839: _low_level_execute_command(): starting 13355 1727096160.28856: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096160.29661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096160.29706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096160.29727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096160.29730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096160.29812: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096160.29835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096160.29847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096160.29921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096160.31628: stdout chunk (state=3): >>>/root <<< 13355 1727096160.31794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096160.31798: stdout chunk (state=3): >>><<< 13355 1727096160.31800: stderr chunk (state=3): >>><<< 13355 1727096160.31931: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096160.31936: _low_level_execute_command(): starting 13355 1727096160.31939: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095 `" && echo ansible-tmp-1727096160.3182464-13838-166730863712095="` echo /root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095 `" ) && sleep 0' 13355 1727096160.32562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096160.32611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096160.32642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096160.32659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096160.32724: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096160.32759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096160.32780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096160.32841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096160.32883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096160.34884: stdout chunk (state=3): >>>ansible-tmp-1727096160.3182464-13838-166730863712095=/root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095 <<< 13355 1727096160.35050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096160.35057: stdout chunk (state=3): >>><<< 13355 1727096160.35061: stderr chunk (state=3): >>><<< 13355 1727096160.35273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096160.3182464-13838-166730863712095=/root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096160.35276: variable 'ansible_module_compression' from source: unknown 13355 1727096160.35278: ANSIBALLZ: Using lock for service_facts 13355 1727096160.35280: ANSIBALLZ: Acquiring lock 13355 1727096160.35281: ANSIBALLZ: Lock acquired: 140397098206704 13355 1727096160.35283: ANSIBALLZ: Creating module 13355 1727096160.59482: ANSIBALLZ: Writing module into payload 13355 1727096160.59588: ANSIBALLZ: Writing module 13355 1727096160.59628: ANSIBALLZ: Renaming module 13355 1727096160.59641: ANSIBALLZ: Done creating module 13355 1727096160.59664: variable 'ansible_facts' from source: unknown 13355 1727096160.59749: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/AnsiballZ_service_facts.py 13355 1727096160.59958: Sending initial data 13355 1727096160.59961: Sent initial data (162 bytes) 13355 1727096160.60591: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096160.60616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096160.60698: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096160.60843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096160.61048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096160.62653: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096160.63176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpi69ot7q3" to remote "/root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/AnsiballZ_service_facts.py" <<< 13355 1727096160.63180: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpi69ot7q3 /root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/AnsiballZ_service_facts.py <<< 13355 1727096160.64072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096160.64142: stderr chunk (state=3): >>><<< 13355 1727096160.64146: stdout chunk (state=3): >>><<< 13355 1727096160.64171: done transferring module to remote 13355 1727096160.64185: _low_level_execute_command(): starting 13355 1727096160.64190: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/ /root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/AnsiballZ_service_facts.py && sleep 0' 13355 1727096160.65690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096160.67628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096160.67680: stderr chunk (state=3): >>><<< 13355 1727096160.67684: stdout chunk (state=3): >>><<< 13355 1727096160.67701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 
1727096160.67704: _low_level_execute_command(): starting 13355 1727096160.67774: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/AnsiballZ_service_facts.py && sleep 0' 13355 1727096160.69453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096160.69591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096160.69608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096160.69791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096162.41217: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": 
{"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 13355 1727096162.41251: stdout chunk (state=3): >>>rk.service": {"name": 
"selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": 
"grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive",<<< 13355 1727096162.41297: stdout chunk (state=3): >>> "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": 
"inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13355 1727096162.43018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096162.43022: stdout chunk (state=3): >>><<< 13355 1727096162.43030: stderr chunk (state=3): >>><<< 13355 1727096162.43054: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": 
"fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": 
"systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": 
{"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096162.43875: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096162.43880: _low_level_execute_command(): starting 13355 1727096162.43882: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096160.3182464-13838-166730863712095/ > /dev/null 2>&1 && sleep 0' 13355 1727096162.44468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096162.44477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096162.44506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096162.44600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096162.44623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096162.44685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096162.46776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096162.46780: stdout chunk (state=3): >>><<< 13355 1727096162.46783: stderr chunk (state=3): >>><<< 13355 1727096162.46785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096162.46787: handler run complete 13355 1727096162.46914: variable 
'ansible_facts' from source: unknown 13355 1727096162.48497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096162.49010: variable 'ansible_facts' from source: unknown 13355 1727096162.49147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096162.49365: attempt loop complete, returning result 13355 1727096162.49374: _execute() done 13355 1727096162.49376: dumping result to json 13355 1727096162.49453: done dumping result, returning 13355 1727096162.49457: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-c514-593f-000000000283] 13355 1727096162.49459: sending task result for task 0afff68d-5257-c514-593f-000000000283 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096162.50433: no more pending results, returning what we have 13355 1727096162.50436: results queue empty 13355 1727096162.50437: checking for any_errors_fatal 13355 1727096162.50441: done checking for any_errors_fatal 13355 1727096162.50441: checking for max_fail_percentage 13355 1727096162.50443: done checking for max_fail_percentage 13355 1727096162.50443: checking to see if all hosts have failed and the running result is not ok 13355 1727096162.50444: done checking to see if all hosts have failed 13355 1727096162.50445: getting the remaining hosts for this loop 13355 1727096162.50446: done getting the remaining hosts for this loop 13355 1727096162.50450: getting the next task for host managed_node3 13355 1727096162.50458: done getting next task for host managed_node3 13355 1727096162.50461: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096162.50465: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096162.50481: done sending task result for task 0afff68d-5257-c514-593f-000000000283 13355 1727096162.50484: WORKER PROCESS EXITING 13355 1727096162.50491: getting variables 13355 1727096162.50492: in VariableManager get_vars() 13355 1727096162.50534: Calling all_inventory to load vars for managed_node3 13355 1727096162.50536: Calling groups_inventory to load vars for managed_node3 13355 1727096162.50539: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096162.50547: Calling all_plugins_play to load vars for managed_node3 13355 1727096162.50549: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096162.50555: Calling groups_plugins_play to load vars for managed_node3 13355 1727096162.50970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096162.51472: done with get_vars() 13355 1727096162.51490: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:56:02 -0400 
(0:00:02.249) 0:00:11.776 ****** 13355 1727096162.51594: entering _queue_task() for managed_node3/package_facts 13355 1727096162.51600: Creating lock for package_facts 13355 1727096162.51943: worker is 1 (out of 1 available) 13355 1727096162.51957: exiting _queue_task() for managed_node3/package_facts 13355 1727096162.52176: done queuing things up, now waiting for results queue to drain 13355 1727096162.52178: waiting for pending results... 13355 1727096162.52622: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096162.52715: in run() - task 0afff68d-5257-c514-593f-000000000284 13355 1727096162.52756: variable 'ansible_search_path' from source: unknown 13355 1727096162.52765: variable 'ansible_search_path' from source: unknown 13355 1727096162.52810: calling self._execute() 13355 1727096162.52915: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096162.52928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096162.52947: variable 'omit' from source: magic vars 13355 1727096162.53356: variable 'ansible_distribution_major_version' from source: facts 13355 1727096162.53390: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096162.53474: variable 'omit' from source: magic vars 13355 1727096162.53482: variable 'omit' from source: magic vars 13355 1727096162.53526: variable 'omit' from source: magic vars 13355 1727096162.53574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096162.53623: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096162.53647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096162.53673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 13355 1727096162.53689: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096162.53731: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096162.53739: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096162.53746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096162.53871: Set connection var ansible_shell_executable to /bin/sh 13355 1727096162.53883: Set connection var ansible_shell_type to sh 13355 1727096162.53893: Set connection var ansible_pipelining to False 13355 1727096162.53901: Set connection var ansible_connection to ssh 13355 1727096162.53915: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096162.53930: Set connection var ansible_timeout to 10 13355 1727096162.54027: variable 'ansible_shell_executable' from source: unknown 13355 1727096162.54031: variable 'ansible_connection' from source: unknown 13355 1727096162.54039: variable 'ansible_module_compression' from source: unknown 13355 1727096162.54041: variable 'ansible_shell_type' from source: unknown 13355 1727096162.54043: variable 'ansible_shell_executable' from source: unknown 13355 1727096162.54045: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096162.54048: variable 'ansible_pipelining' from source: unknown 13355 1727096162.54049: variable 'ansible_timeout' from source: unknown 13355 1727096162.54051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096162.54298: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096162.54315: variable 'omit' from source: magic vars 13355 
1727096162.54325: starting attempt loop 13355 1727096162.54331: running the handler 13355 1727096162.54350: _low_level_execute_command(): starting 13355 1727096162.54378: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096162.55256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096162.55321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096162.55360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096162.55385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096162.55457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096162.57349: stdout chunk (state=3): >>>/root <<< 13355 1727096162.57389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096162.57397: stdout chunk (state=3): >>><<< 13355 1727096162.57399: stderr chunk (state=3): >>><<< 13355 1727096162.57676: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096162.57680: _low_level_execute_command(): starting 13355 1727096162.57683: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787 `" && echo ansible-tmp-1727096162.5758963-13931-262275975212787="` echo /root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787 `" ) && sleep 0' 13355 1727096162.58445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096162.58462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096162.58556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096162.58599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096162.58625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096162.58643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096162.58790: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096162.60812: stdout chunk (state=3): >>>ansible-tmp-1727096162.5758963-13931-262275975212787=/root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787 <<< 13355 1727096162.61000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096162.61026: stderr chunk (state=3): >>><<< 13355 1727096162.61030: stdout chunk (state=3): >>><<< 13355 1727096162.61050: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096162.5758963-13931-262275975212787=/root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096162.61477: variable 'ansible_module_compression' from source: unknown 13355 1727096162.61481: ANSIBALLZ: Using lock for package_facts 13355 1727096162.61483: ANSIBALLZ: Acquiring lock 13355 1727096162.61485: ANSIBALLZ: Lock acquired: 140397095858544 13355 1727096162.61487: ANSIBALLZ: Creating module 13355 1727096162.92464: ANSIBALLZ: Writing module into payload 13355 1727096162.92555: ANSIBALLZ: Writing module 13355 1727096162.92583: ANSIBALLZ: Renaming module 13355 1727096162.92590: ANSIBALLZ: Done creating module 13355 1727096162.92610: variable 'ansible_facts' from source: unknown 13355 1727096162.92730: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/AnsiballZ_package_facts.py 13355 1727096162.92835: Sending initial data 13355 1727096162.92838: Sent initial data (162 bytes) 13355 1727096162.93269: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096162.93279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096162.93300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096162.93304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096162.93306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096162.93352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096162.93370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096162.93412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096162.95060: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server 
supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096162.95086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096162.95118: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpsb0brw4w /root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/AnsiballZ_package_facts.py <<< 13355 1727096162.95121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/AnsiballZ_package_facts.py" <<< 13355 1727096162.95149: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpsb0brw4w" to remote "/root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/AnsiballZ_package_facts.py" <<< 13355 1727096162.95153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/AnsiballZ_package_facts.py" <<< 13355 1727096162.96139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096162.96188: stderr chunk (state=3): >>><<< 13355 1727096162.96191: stdout chunk (state=3): >>><<< 13355 1727096162.96233: done transferring module to remote 13355 1727096162.96242: _low_level_execute_command(): starting 13355 1727096162.96247: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/ /root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/AnsiballZ_package_facts.py && sleep 0' 13355 1727096162.96706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096162.96709: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096162.96712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096162.96714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096162.96716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096162.96765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096162.96773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096162.96806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096162.98630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096162.98656: stderr chunk (state=3): >>><<< 13355 1727096162.98660: stdout chunk (state=3): >>><<< 13355 1727096162.98678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096162.98681: _low_level_execute_command(): starting 13355 1727096162.98686: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/AnsiballZ_package_facts.py && sleep 0' 13355 1727096162.99125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096162.99128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096162.99131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096162.99133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096162.99135: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096162.99191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096162.99200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096162.99203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096162.99236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096163.44383: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": 
"tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": 
"0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": 
[{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": 
"10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": 
"19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": 
"rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": 
"1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": 
"kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": 
[{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 13355 1727096163.44475: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": 
"7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", 
"version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", 
"release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13355 1727096163.46277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096163.46281: stdout chunk (state=3): >>><<< 13355 1727096163.46284: stderr chunk (state=3): >>><<< 13355 1727096163.46293: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096163.50375: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096163.50384: _low_level_execute_command(): starting 13355 1727096163.50386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096162.5758963-13931-262275975212787/ > /dev/null 2>&1 && sleep 0' 13355 1727096163.51317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096163.51489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096163.51574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096163.51577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096163.51580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096163.51582: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096163.51585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096163.51587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096163.51591: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is 
address <<< 13355 1727096163.51593: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096163.51595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096163.51597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096163.51600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096163.51602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096163.51604: stderr chunk (state=3): >>>debug2: match found <<< 13355 1727096163.51608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096163.51976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096163.51979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096163.52074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096163.53913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096163.53966: stderr chunk (state=3): >>><<< 13355 1727096163.54087: stdout chunk (state=3): >>><<< 13355 1727096163.54104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096163.54112: handler run complete 13355 1727096163.56122: variable 'ansible_facts' from source: unknown 13355 1727096163.56928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096163.60999: variable 'ansible_facts' from source: unknown 13355 1727096163.61803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096163.63076: attempt loop complete, returning result 13355 1727096163.63095: _execute() done 13355 1727096163.63098: dumping result to json 13355 1727096163.63714: done dumping result, returning 13355 1727096163.63725: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-c514-593f-000000000284] 13355 1727096163.63730: sending task result for task 0afff68d-5257-c514-593f-000000000284 13355 1727096163.78532: done sending task result for task 0afff68d-5257-c514-593f-000000000284 13355 1727096163.78536: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096163.78636: no more pending results, returning what we have 13355 1727096163.78639: results queue empty 13355 1727096163.78640: 
checking for any_errors_fatal 13355 1727096163.78644: done checking for any_errors_fatal 13355 1727096163.78645: checking for max_fail_percentage 13355 1727096163.78646: done checking for max_fail_percentage 13355 1727096163.78647: checking to see if all hosts have failed and the running result is not ok 13355 1727096163.78648: done checking to see if all hosts have failed 13355 1727096163.78649: getting the remaining hosts for this loop 13355 1727096163.78650: done getting the remaining hosts for this loop 13355 1727096163.78656: getting the next task for host managed_node3 13355 1727096163.78663: done getting next task for host managed_node3 13355 1727096163.78666: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096163.78671: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096163.78681: getting variables 13355 1727096163.78682: in VariableManager get_vars() 13355 1727096163.78723: Calling all_inventory to load vars for managed_node3 13355 1727096163.78726: Calling groups_inventory to load vars for managed_node3 13355 1727096163.78729: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096163.78737: Calling all_plugins_play to load vars for managed_node3 13355 1727096163.78740: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096163.78743: Calling groups_plugins_play to load vars for managed_node3 13355 1727096163.81320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096163.84800: done with get_vars() 13355 1727096163.84838: done getting variables 13355 1727096163.84999: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:56:03 -0400 (0:00:01.334) 0:00:13.110 ****** 13355 1727096163.85033: entering _queue_task() for managed_node3/debug 13355 1727096163.85809: worker is 1 (out of 1 available) 13355 1727096163.85822: exiting _queue_task() for managed_node3/debug 13355 1727096163.85835: done queuing things up, now waiting for results queue to drain 13355 1727096163.85836: waiting for pending results... 
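The task being queued here, `fedora.linux_system_roles.network : Print network provider` at `tasks/main.yml:7`, is a plain `debug` action (the log loads `ActionModule 'debug'` for it and later prints `Using network provider: nm`). A sketch of the kind of task this corresponds to — the role's actual wording may differ, and `network_provider` is the `set_fact` variable visible in the log:

```yaml
# Hedged sketch of the queued debug task; the role's actual tasks/main.yml:7
# may differ in detail.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```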
13355 1727096163.86138: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096163.86420: in run() - task 0afff68d-5257-c514-593f-000000000027 13355 1727096163.86774: variable 'ansible_search_path' from source: unknown 13355 1727096163.86777: variable 'ansible_search_path' from source: unknown 13355 1727096163.86781: calling self._execute() 13355 1727096163.86783: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096163.86786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096163.86791: variable 'omit' from source: magic vars 13355 1727096163.87455: variable 'ansible_distribution_major_version' from source: facts 13355 1727096163.87479: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096163.87492: variable 'omit' from source: magic vars 13355 1727096163.87734: variable 'omit' from source: magic vars 13355 1727096163.87843: variable 'network_provider' from source: set_fact 13355 1727096163.87871: variable 'omit' from source: magic vars 13355 1727096163.88115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096163.88373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096163.88376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096163.88379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096163.88381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096163.88383: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096163.88386: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 
1727096163.88389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096163.88391: Set connection var ansible_shell_executable to /bin/sh 13355 1727096163.88393: Set connection var ansible_shell_type to sh 13355 1727096163.88395: Set connection var ansible_pipelining to False 13355 1727096163.88398: Set connection var ansible_connection to ssh 13355 1727096163.88400: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096163.88402: Set connection var ansible_timeout to 10 13355 1727096163.88591: variable 'ansible_shell_executable' from source: unknown 13355 1727096163.88600: variable 'ansible_connection' from source: unknown 13355 1727096163.88608: variable 'ansible_module_compression' from source: unknown 13355 1727096163.88615: variable 'ansible_shell_type' from source: unknown 13355 1727096163.88622: variable 'ansible_shell_executable' from source: unknown 13355 1727096163.88629: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096163.88638: variable 'ansible_pipelining' from source: unknown 13355 1727096163.88644: variable 'ansible_timeout' from source: unknown 13355 1727096163.88652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096163.88988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096163.89010: variable 'omit' from source: magic vars 13355 1727096163.89020: starting attempt loop 13355 1727096163.89027: running the handler 13355 1727096163.89085: handler run complete 13355 1727096163.89373: attempt loop complete, returning result 13355 1727096163.89377: _execute() done 13355 1727096163.89379: dumping result to json 13355 1727096163.89382: done dumping result, returning 
13355 1727096163.89384: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-c514-593f-000000000027] 13355 1727096163.89387: sending task result for task 0afff68d-5257-c514-593f-000000000027 13355 1727096163.89458: done sending task result for task 0afff68d-5257-c514-593f-000000000027 13355 1727096163.89461: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 13355 1727096163.89527: no more pending results, returning what we have 13355 1727096163.89530: results queue empty 13355 1727096163.89531: checking for any_errors_fatal 13355 1727096163.89539: done checking for any_errors_fatal 13355 1727096163.89540: checking for max_fail_percentage 13355 1727096163.89542: done checking for max_fail_percentage 13355 1727096163.89543: checking to see if all hosts have failed and the running result is not ok 13355 1727096163.89543: done checking to see if all hosts have failed 13355 1727096163.89544: getting the remaining hosts for this loop 13355 1727096163.89545: done getting the remaining hosts for this loop 13355 1727096163.89550: getting the next task for host managed_node3 13355 1727096163.89557: done getting next task for host managed_node3 13355 1727096163.89561: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13355 1727096163.89564: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 13355 1727096163.89579: getting variables 13355 1727096163.89581: in VariableManager get_vars() 13355 1727096163.89638: Calling all_inventory to load vars for managed_node3 13355 1727096163.89641: Calling groups_inventory to load vars for managed_node3 13355 1727096163.89644: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096163.89656: Calling all_plugins_play to load vars for managed_node3 13355 1727096163.89659: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096163.89662: Calling groups_plugins_play to load vars for managed_node3 13355 1727096163.92318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096163.94209: done with get_vars() 13355 1727096163.94255: done getting variables 13355 1727096163.94358: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:56:03 -0400 (0:00:00.093) 0:00:13.204 ****** 13355 1727096163.94396: entering _queue_task() for managed_node3/fail 13355 1727096163.94398: Creating lock for fail 13355 1727096163.94866: worker is 1 (out of 1 available) 13355 1727096163.94882: exiting _queue_task() for managed_node3/fail 13355 1727096163.94903: done queuing things up, now waiting for results queue to drain 13355 1727096163.94904: waiting for pending results... 
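The `fail` task queued here is a guard that only fires when its `when:` condition holds; the run below records `Evaluated conditional (network_state != {}): False` and skips it with `false_condition: "network_state != {}"`. A hedged sketch of such a guard — the condition is taken from the log, but the failure message and any additional conditions are illustrative, not the role's actual text:

```yaml
# Hedged sketch of a conditional guard matching the logged skip
# (false_condition: "network_state != {}"); message is illustrative.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # illustrative
  when: network_state != {}
```

Because `network_state` resolves to the role default `{}` on this run, the conditional is False and the executor short-circuits with "when evaluation is False, skipping this task" instead of ever running the `fail` action.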
13355 1727096163.95686: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13355 1727096163.95692: in run() - task 0afff68d-5257-c514-593f-000000000028 13355 1727096163.95696: variable 'ansible_search_path' from source: unknown 13355 1727096163.95698: variable 'ansible_search_path' from source: unknown 13355 1727096163.95792: calling self._execute() 13355 1727096163.95891: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096163.96086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096163.96103: variable 'omit' from source: magic vars 13355 1727096163.96720: variable 'ansible_distribution_major_version' from source: facts 13355 1727096163.96780: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096163.96918: variable 'network_state' from source: role '' defaults 13355 1727096163.96963: Evaluated conditional (network_state != {}): False 13355 1727096163.96977: when evaluation is False, skipping this task 13355 1727096163.96983: _execute() done 13355 1727096163.96991: dumping result to json 13355 1727096163.96999: done dumping result, returning 13355 1727096163.97013: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-c514-593f-000000000028] 13355 1727096163.97023: sending task result for task 0afff68d-5257-c514-593f-000000000028 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096163.97192: no more pending results, returning what we have 13355 1727096163.97196: results queue empty 13355 1727096163.97197: checking for any_errors_fatal 13355 1727096163.97312: done 
checking for any_errors_fatal 13355 1727096163.97314: checking for max_fail_percentage 13355 1727096163.97316: done checking for max_fail_percentage 13355 1727096163.97317: checking to see if all hosts have failed and the running result is not ok 13355 1727096163.97317: done checking to see if all hosts have failed 13355 1727096163.97318: getting the remaining hosts for this loop 13355 1727096163.97319: done getting the remaining hosts for this loop 13355 1727096163.97323: getting the next task for host managed_node3 13355 1727096163.97330: done getting next task for host managed_node3 13355 1727096163.97335: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096163.97338: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096163.97356: getting variables 13355 1727096163.97358: in VariableManager get_vars() 13355 1727096163.97405: Calling all_inventory to load vars for managed_node3 13355 1727096163.97408: Calling groups_inventory to load vars for managed_node3 13355 1727096163.97410: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096163.97420: Calling all_plugins_play to load vars for managed_node3 13355 1727096163.97422: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096163.97425: Calling groups_plugins_play to load vars for managed_node3 13355 1727096163.97946: done sending task result for task 0afff68d-5257-c514-593f-000000000028 13355 1727096163.97950: WORKER PROCESS EXITING 13355 1727096163.98767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096164.01887: done with get_vars() 13355 1727096164.02039: done getting variables 13355 1727096164.02102: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:56:04 -0400 (0:00:00.077) 0:00:13.281 ****** 13355 1727096164.02251: entering _queue_task() for managed_node3/fail 13355 1727096164.02926: worker is 1 (out of 1 available) 13355 1727096164.02937: exiting _queue_task() for managed_node3/fail 13355 1727096164.02949: done queuing things up, now waiting for results queue to drain 13355 1727096164.02950: waiting for pending results... 
13355 1727096164.03353: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096164.03669: in run() - task 0afff68d-5257-c514-593f-000000000029 13355 1727096164.03674: variable 'ansible_search_path' from source: unknown 13355 1727096164.03677: variable 'ansible_search_path' from source: unknown 13355 1727096164.03680: calling self._execute() 13355 1727096164.04009: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096164.04013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096164.04024: variable 'omit' from source: magic vars 13355 1727096164.04812: variable 'ansible_distribution_major_version' from source: facts 13355 1727096164.04825: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096164.05193: variable 'network_state' from source: role '' defaults 13355 1727096164.05205: Evaluated conditional (network_state != {}): False 13355 1727096164.05208: when evaluation is False, skipping this task 13355 1727096164.05211: _execute() done 13355 1727096164.05214: dumping result to json 13355 1727096164.05216: done dumping result, returning 13355 1727096164.05227: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-c514-593f-000000000029] 13355 1727096164.05230: sending task result for task 0afff68d-5257-c514-593f-000000000029 13355 1727096164.05363: done sending task result for task 0afff68d-5257-c514-593f-000000000029 13355 1727096164.05366: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096164.05440: no more pending results, returning what we have 13355 
1727096164.05444: results queue empty 13355 1727096164.05445: checking for any_errors_fatal 13355 1727096164.05453: done checking for any_errors_fatal 13355 1727096164.05453: checking for max_fail_percentage 13355 1727096164.05455: done checking for max_fail_percentage 13355 1727096164.05456: checking to see if all hosts have failed and the running result is not ok 13355 1727096164.05457: done checking to see if all hosts have failed 13355 1727096164.05457: getting the remaining hosts for this loop 13355 1727096164.05458: done getting the remaining hosts for this loop 13355 1727096164.05463: getting the next task for host managed_node3 13355 1727096164.05471: done getting next task for host managed_node3 13355 1727096164.05475: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096164.05478: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096164.05494: getting variables 13355 1727096164.05495: in VariableManager get_vars() 13355 1727096164.05552: Calling all_inventory to load vars for managed_node3 13355 1727096164.05555: Calling groups_inventory to load vars for managed_node3 13355 1727096164.05558: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096164.05673: Calling all_plugins_play to load vars for managed_node3 13355 1727096164.05682: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096164.05685: Calling groups_plugins_play to load vars for managed_node3 13355 1727096164.08597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096164.11953: done with get_vars() 13355 1727096164.12025: done getting variables 13355 1727096164.12089: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:56:04 -0400 (0:00:00.101) 0:00:13.383 ****** 13355 1727096164.12241: entering _queue_task() for managed_node3/fail 13355 1727096164.12992: worker is 1 (out of 1 available) 13355 1727096164.13005: exiting _queue_task() for managed_node3/fail 13355 1727096164.13017: done queuing things up, now waiting for results queue to drain 13355 1727096164.13019: waiting for pending results... 
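For this teaming guard, the log evaluates a chain of conditionals: `ansible_distribution_major_version != '6'`, then `ansible_distribution_major_version | int > 9`, then `ansible_distribution in __network_rh_distros`. A hedged reconstruction of the guard those evaluations imply — the variable names appear in the log, but the failure message is illustrative:

```yaml
# Hedged reconstruction of the conditional chain evaluated for this task;
# ansible_distribution_major_version, ansible_distribution, and
# __network_rh_distros come from the log, the msg is illustrative.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not available on EL10 or later  # illustrative
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
```

All three logged evaluations return True on this EL10 host, so unlike the previous guards this task proceeds into inspecting `network_connections` (the `controller_profile`/`port1_profile`/`port2_profile` variable lookups that follow) to decide whether a team interface is actually being configured.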
13355 1727096164.13687: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
13355 1727096164.13693: in run() - task 0afff68d-5257-c514-593f-00000000002a
13355 1727096164.14074: variable 'ansible_search_path' from source: unknown
13355 1727096164.14078: variable 'ansible_search_path' from source: unknown
13355 1727096164.14081: calling self._execute()
13355 1727096164.14084: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096164.14086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096164.14089: variable 'omit' from source: magic vars
13355 1727096164.14815: variable 'ansible_distribution_major_version' from source: facts
13355 1727096164.15273: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096164.15278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13355 1727096164.19800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13355 1727096164.20056: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13355 1727096164.20101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13355 1727096164.20212: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13355 1727096164.20244: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13355 1727096164.20450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.20491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.20873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.20876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.20879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.20881: variable 'ansible_distribution_major_version' from source: facts
13355 1727096164.21272: Evaluated conditional (ansible_distribution_major_version | int > 9): True
13355 1727096164.21276: variable 'ansible_distribution' from source: facts
13355 1727096164.21279: variable '__network_rh_distros' from source: role '' defaults
13355 1727096164.21281: Evaluated conditional (ansible_distribution in __network_rh_distros): True
13355 1727096164.21890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.21922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.21951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.22002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.22024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.22472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.22476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.22479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.22481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.22483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.22485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.22488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.22502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.22547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.22572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.22902: variable 'network_connections' from source: task vars
13355 1727096164.22922: variable 'controller_profile' from source: play vars
13355 1727096164.22996: variable 'controller_profile' from source: play vars
13355 1727096164.23010: variable 'controller_device' from source: play vars
13355 1727096164.23077: variable 'controller_device' from source: play vars
13355 1727096164.23093: variable 'port1_profile' from source: play vars
13355 1727096164.23235: variable 'port1_profile' from source: play vars
13355 1727096164.23238: variable 'dhcp_interface1' from source: play vars
13355 1727096164.23240: variable 'dhcp_interface1' from source: play vars
13355 1727096164.23242: variable 'controller_profile' from source: play vars
13355 1727096164.23299: variable 'controller_profile' from source: play vars
13355 1727096164.23310: variable 'port2_profile' from source: play vars
13355 1727096164.23379: variable 'port2_profile' from source: play vars
13355 1727096164.23391: variable 'dhcp_interface2' from source: play vars
13355 1727096164.23456: variable 'dhcp_interface2' from source: play vars
13355 1727096164.23469: variable 'controller_profile' from source: play vars
13355 1727096164.23537: variable 'controller_profile' from source: play vars
13355 1727096164.23550: variable 'network_state' from source: role '' defaults
13355 1727096164.23626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13355 1727096164.23813: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13355 1727096164.23882: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13355 1727096164.23899: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13355 1727096164.23931: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13355 1727096164.24004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13355 1727096164.24072: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13355 1727096164.24075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.24090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13355 1727096164.24137: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
13355 1727096164.24145: when evaluation is False, skipping this task
13355 1727096164.24151: _execute() done
13355 1727096164.24164: dumping result to json
13355 1727096164.24172: done dumping result, returning
13355 1727096164.24184: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-c514-593f-00000000002a]
13355 1727096164.24192: sending task result for task 0afff68d-5257-c514-593f-00000000002a
13355 1727096164.24573: done sending task result for task 0afff68d-5257-c514-593f-00000000002a
13355 1727096164.24576: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
13355 1727096164.24615: no more pending results, returning what we have
13355 1727096164.24618: results queue empty
13355 1727096164.24618: checking for any_errors_fatal
13355 1727096164.24623: done checking for any_errors_fatal
13355 1727096164.24623: checking for max_fail_percentage
13355 1727096164.24625: done checking for max_fail_percentage
13355 1727096164.24625: checking to see if all hosts have failed and the running result is not ok
13355 1727096164.24626: done checking to see if all hosts have failed
13355 1727096164.24627: getting the remaining hosts for this loop
13355 1727096164.24629: done getting the remaining hosts for this loop
13355 1727096164.24632: getting the next task for host managed_node3
13355 1727096164.24637: done getting next task for host managed_node3
13355 1727096164.24641: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
13355 1727096164.24644: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096164.24659: getting variables
13355 1727096164.24660: in VariableManager get_vars()
13355 1727096164.24709: Calling all_inventory to load vars for managed_node3
13355 1727096164.24712: Calling groups_inventory to load vars for managed_node3
13355 1727096164.24714: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096164.24723: Calling all_plugins_play to load vars for managed_node3
13355 1727096164.24726: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096164.24728: Calling groups_plugins_play to load vars for managed_node3
13355 1727096164.26032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096164.27663: done with get_vars()
13355 1727096164.27689: done getting variables
13355 1727096164.27788: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Monday 23 September 2024 08:56:04 -0400 (0:00:00.155) 0:00:13.538 ******
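The teaming-abort task above was skipped because its `false_condition` — a Jinja2 `selectattr` pipeline over `network_connections` — evaluated to False. As a rough illustration only (not the role's actual implementation), the `selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` chain behaves like the following Python sketch; the connection list here is hypothetical, shaped like the bond controller/port profiles in this run:

```python
import re

def team_connections(connections):
    """Mimic: connections | selectattr("type", "defined")
                          | selectattr("type", "match", "^team$") | list"""
    # selectattr("type", "defined"): keep only items that have a "type" key
    defined = [c for c in connections if "type" in c]
    # selectattr("type", "match", "^team$"): Jinja's match test anchors at
    # the start of the string, like re.match
    return [c for c in defined if re.match("^team$", c["type"])]

# Hypothetical profiles resembling the bond topology in this log
connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet"},
    {"name": "bond0.1", "type": "ethernet"},
]

# No team-type connection exists, so `length > 0` is False and the task skips
print(len(team_connections(connections)) > 0)  # prints False
```

The same pipeline is applied to `network_state.get("interfaces", [])`; both halves being empty makes the `or` condition False.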
13355 1727096164.27820: entering _queue_task() for managed_node3/dnf
13355 1727096164.28169: worker is 1 (out of 1 available)
13355 1727096164.28182: exiting _queue_task() for managed_node3/dnf
13355 1727096164.28196: done queuing things up, now waiting for results queue to drain
13355 1727096164.28197: waiting for pending results...
13355 1727096164.28476: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
13355 1727096164.28623: in run() - task 0afff68d-5257-c514-593f-00000000002b
13355 1727096164.28646: variable 'ansible_search_path' from source: unknown
13355 1727096164.28660: variable 'ansible_search_path' from source: unknown
13355 1727096164.28708: calling self._execute()
13355 1727096164.28797: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096164.28815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096164.28829: variable 'omit' from source: magic vars
13355 1727096164.29213: variable 'ansible_distribution_major_version' from source: facts
13355 1727096164.29245: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096164.29463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13355 1727096164.31683: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13355 1727096164.31761: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13355 1727096164.31808: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13355 1727096164.31874: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13355 1727096164.31882: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13355 1727096164.31966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.32008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.32173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.32177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.32179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.32224: variable 'ansible_distribution' from source: facts
13355 1727096164.32236: variable 'ansible_distribution_major_version' from source: facts
13355 1727096164.32257: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
13355 1727096164.32380: variable '__network_wireless_connections_defined' from source: role '' defaults
13355 1727096164.32524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.32557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.32589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.32639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.32660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.32705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.32739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.32773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.32816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.32838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.32885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.32947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.32951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.32992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.33011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.33184: variable 'network_connections' from source: task vars
13355 1727096164.33203: variable 'controller_profile' from source: play vars
13355 1727096164.33372: variable 'controller_profile' from source: play vars
13355 1727096164.33375: variable 'controller_device' from source: play vars
13355 1727096164.33379: variable 'controller_device' from source: play vars
13355 1727096164.33381: variable 'port1_profile' from source: play vars
13355 1727096164.33427: variable 'port1_profile' from source: play vars
13355 1727096164.33441: variable 'dhcp_interface1' from source: play vars
13355 1727096164.33511: variable 'dhcp_interface1' from source: play vars
13355 1727096164.33524: variable 'controller_profile' from source: play vars
13355 1727096164.33590: variable 'controller_profile' from source: play vars
13355 1727096164.33607: variable 'port2_profile' from source: play vars
13355 1727096164.33670: variable 'port2_profile' from source: play vars
13355 1727096164.33713: variable 'dhcp_interface2' from source: play vars
13355 1727096164.33747: variable 'dhcp_interface2' from source: play vars
13355 1727096164.33764: variable 'controller_profile' from source: play vars
13355 1727096164.33831: variable 'controller_profile' from source: play vars
13355 1727096164.33931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13355 1727096164.34149: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13355 1727096164.34159: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13355 1727096164.34196: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13355 1727096164.34231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13355 1727096164.34287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13355 1727096164.34333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13355 1727096164.34372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.34474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13355 1727096164.34478: variable '__network_team_connections_defined' from source: role '' defaults
13355 1727096164.34709: variable 'network_connections' from source: task vars
13355 1727096164.34719: variable 'controller_profile' from source: play vars
13355 1727096164.34781: variable 'controller_profile' from source: play vars
13355 1727096164.34795: variable 'controller_device' from source: play vars
13355 1727096164.34858: variable 'controller_device' from source: play vars
13355 1727096164.34910: variable 'port1_profile' from source: play vars
13355 1727096164.34938: variable 'port1_profile' from source: play vars
13355 1727096164.34949: variable 'dhcp_interface1' from source: play vars
13355 1727096164.35015: variable 'dhcp_interface1' from source: play vars
13355 1727096164.35030: variable 'controller_profile' from source: play vars
13355 1727096164.35131: variable 'controller_profile' from source: play vars
13355 1727096164.35134: variable 'port2_profile' from source: play vars
13355 1727096164.35171: variable 'port2_profile' from source: play vars
13355 1727096164.35184: variable 'dhcp_interface2' from source: play vars
13355 1727096164.35246: variable 'dhcp_interface2' from source: play vars
13355 1727096164.35260: variable 'controller_profile' from source: play vars
13355 1727096164.35322: variable 'controller_profile' from source: play vars
13355 1727096164.35369: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
13355 1727096164.35460: when evaluation is False, skipping this task
13355 1727096164.35463: _execute() done
13355 1727096164.35465: dumping result to json
13355 1727096164.35468: done dumping result, returning
13355 1727096164.35471: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-00000000002b]
13355 1727096164.35473: sending task result for task 0afff68d-5257-c514-593f-00000000002b
13355 1727096164.35542: done sending task result for task 0afff68d-5257-c514-593f-00000000002b
13355 1727096164.35545: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
13355 1727096164.35614: no more pending results, returning what we have
13355 1727096164.35618: results queue empty
13355 1727096164.35619: checking for any_errors_fatal
13355 1727096164.35625: done checking for any_errors_fatal
13355 1727096164.35626: checking for max_fail_percentage
13355 1727096164.35628: done checking for max_fail_percentage
13355 1727096164.35628: checking to see if all hosts have failed and the running result is not ok
13355 1727096164.35629: done checking to see if all hosts have failed
13355 1727096164.35630: getting the remaining hosts for this loop
13355 1727096164.35631: done getting the remaining hosts for this loop
13355 1727096164.35635: getting the next task for host managed_node3
13355 1727096164.35641: done getting next task for host managed_node3
13355 1727096164.35645: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
13355 1727096164.35648: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096164.35668: getting variables
13355 1727096164.35670: in VariableManager get_vars()
13355 1727096164.35725: Calling all_inventory to load vars for managed_node3
13355 1727096164.35729: Calling groups_inventory to load vars for managed_node3
13355 1727096164.35731: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096164.35741: Calling all_plugins_play to load vars for managed_node3
13355 1727096164.35745: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096164.35748: Calling groups_plugins_play to load vars for managed_node3
13355 1727096164.37313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096164.38848: done with get_vars()
13355 1727096164.38885: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13355 1727096164.38960: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Monday 23 September 2024 08:56:04 -0400 (0:00:00.111) 0:00:13.650 ******
13355 1727096164.38995: entering _queue_task() for managed_node3/yum
13355 1727096164.38997: Creating lock for yum
13355 1727096164.39351: worker is 1 (out of 1 available)
13355 1727096164.39366: exiting _queue_task() for managed_node3/yum
13355 1727096164.39380: done queuing things up, now waiting for results queue to drain
13355 1727096164.39382: waiting for pending results...
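The DNF check above was skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` held for this connection list. As a loose sketch only (the helper name `any_of_type` and the data are hypothetical, not the role's actual code), the flags amount to "does any connection declare this type":

```python
def any_of_type(connections, wanted):
    """Loosely mirror how flags like __network_team_connections_defined are
    derived: True when any connection declares the given type."""
    return any(c.get("type") == wanted for c in connections)

# Hypothetical connection list matching the bond topology in this log
connections = [
    {"name": "bond0", "type": "bond"},
    {"name": "bond0.0", "type": "ethernet"},
    {"name": "bond0.1", "type": "ethernet"},
]

wireless_defined = any_of_type(connections, "wireless")
team_defined = any_of_type(connections, "team")

# Neither flag is True, so the task's `when` is False and it is skipped
print(wireless_defined or team_defined)  # prints False
```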
13355 1727096164.39789: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
13355 1727096164.39794: in run() - task 0afff68d-5257-c514-593f-00000000002c
13355 1727096164.39816: variable 'ansible_search_path' from source: unknown
13355 1727096164.39825: variable 'ansible_search_path' from source: unknown
13355 1727096164.39869: calling self._execute()
13355 1727096164.39964: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096164.39980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096164.40000: variable 'omit' from source: magic vars
13355 1727096164.40377: variable 'ansible_distribution_major_version' from source: facts
13355 1727096164.40395: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096164.40582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13355 1727096164.42917: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13355 1727096164.43012: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13355 1727096164.43062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13355 1727096164.43108: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13355 1727096164.43137: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13355 1727096164.43233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096164.43277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096164.43314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096164.43362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096164.43392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096164.43505: variable 'ansible_distribution_major_version' from source: facts
13355 1727096164.43532: Evaluated conditional (ansible_distribution_major_version | int < 8): False
13355 1727096164.43572: when evaluation is False, skipping this task
13355 1727096164.43575: _execute() done
13355 1727096164.43578: dumping result to json
13355 1727096164.43580: done dumping result, returning
13355 1727096164.43582: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-00000000002c]
13355 1727096164.43587: sending task result for task 0afff68d-5257-c514-593f-00000000002c
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
13355 1727096164.43896: no more pending results, returning what we have
13355 1727096164.43901: results queue empty
13355 1727096164.43902: checking for any_errors_fatal
13355 1727096164.43908: done checking for any_errors_fatal
13355 1727096164.43909: checking for max_fail_percentage
13355 1727096164.43911: done checking for max_fail_percentage
13355 1727096164.43912: checking to see if all hosts have failed and the running result is not ok
13355 1727096164.43913: done checking to see if all hosts have failed
13355 1727096164.43913: getting the remaining hosts for this loop
13355 1727096164.43915: done getting the remaining hosts for this loop
13355 1727096164.43919: getting the next task for host managed_node3
13355 1727096164.43926: done getting next task for host managed_node3
13355 1727096164.43930: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
13355 1727096164.43933: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096164.43948: getting variables
13355 1727096164.43949: in VariableManager get_vars()
13355 1727096164.44011: Calling all_inventory to load vars for managed_node3
13355 1727096164.44014: Calling groups_inventory to load vars for managed_node3
13355 1727096164.44016: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096164.44027: Calling all_plugins_play to load vars for managed_node3
13355 1727096164.44030: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096164.44033: Calling groups_plugins_play to load vars for managed_node3
13355 1727096164.44615: done sending task result for task 0afff68d-5257-c514-593f-00000000002c
13355 1727096164.44619: WORKER PROCESS EXITING
13355 1727096164.45777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096164.47378: done with get_vars()
13355 1727096164.47407: done getting variables
13355 1727096164.47480: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Monday 23 September 2024 08:56:04 -0400 (0:00:00.085) 0:00:13.735 ******
13355 1727096164.47515: entering _queue_task() for managed_node3/fail
13355 1727096164.47986: worker is 1 (out of 1 available)
13355 1727096164.47997: exiting _queue_task() for managed_node3/fail
13355 1727096164.48009: done queuing things up, now waiting for results queue to drain
13355 1727096164.48010: waiting for pending results...
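The YUM variant above was skipped on `ansible_distribution_major_version | int < 8`: Jinja2's `int` filter coerces the string-valued fact before comparing. Since the earlier teaming check established `ansible_distribution_major_version | int > 9` on this host, the `< 8` test is necessarily False. A tiny sketch (the helper name `version_lt` is hypothetical, and `"10"` is an example value consistent with the `> 9` result logged above):

```python
def version_lt(fact_value, threshold):
    """Approximate `ansible_distribution_major_version | int < threshold`:
    the fact is a string, so coerce it to int before comparing."""
    return int(fact_value) < threshold

print(version_lt("10", 8))  # prints False -> the YUM path is skipped
print(version_lt("7", 8))   # prints True on an EL7-era host
```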
13355 1727096164.48201: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096164.48358: in run() - task 0afff68d-5257-c514-593f-00000000002d 13355 1727096164.48428: variable 'ansible_search_path' from source: unknown 13355 1727096164.48432: variable 'ansible_search_path' from source: unknown 13355 1727096164.48435: calling self._execute() 13355 1727096164.48526: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096164.48543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096164.48564: variable 'omit' from source: magic vars 13355 1727096164.48973: variable 'ansible_distribution_major_version' from source: facts 13355 1727096164.49072: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096164.49164: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096164.49378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096164.51713: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096164.51800: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096164.51846: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096164.51900: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096164.51926: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096164.52025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096164.52118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096164.52121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.52139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096164.52161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096164.52215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096164.52251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096164.52288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.52334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096164.52357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096164.52409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096164.52442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096164.52554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.52558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096164.52560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096164.52731: variable 'network_connections' from source: task vars 13355 1727096164.52766: variable 'controller_profile' from source: play vars 13355 1727096164.52842: variable 'controller_profile' from source: play vars 13355 1727096164.52865: variable 'controller_device' from source: play vars 13355 1727096164.53075: variable 'controller_device' from source: play vars 13355 1727096164.53079: variable 'port1_profile' from source: play vars 13355 1727096164.53082: variable 'port1_profile' from source: play vars 13355 1727096164.53092: variable 'dhcp_interface1' from source: play vars 13355 1727096164.53162: variable 'dhcp_interface1' from source: play vars 13355 1727096164.53189: variable 'controller_profile' from source: play vars 13355 
1727096164.53256: variable 'controller_profile' from source: play vars 13355 1727096164.53272: variable 'port2_profile' from source: play vars 13355 1727096164.53341: variable 'port2_profile' from source: play vars 13355 1727096164.53357: variable 'dhcp_interface2' from source: play vars 13355 1727096164.53431: variable 'dhcp_interface2' from source: play vars 13355 1727096164.53444: variable 'controller_profile' from source: play vars 13355 1727096164.53515: variable 'controller_profile' from source: play vars 13355 1727096164.53600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096164.53842: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096164.53850: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096164.53893: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096164.53927: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096164.53990: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096164.54017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096164.54048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.54091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13355 1727096164.54372: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096164.54445: variable 'network_connections' from source: task vars 13355 1727096164.54459: variable 'controller_profile' from source: play vars 13355 1727096164.54530: variable 'controller_profile' from source: play vars 13355 1727096164.54543: variable 'controller_device' from source: play vars 13355 1727096164.54615: variable 'controller_device' from source: play vars 13355 1727096164.54631: variable 'port1_profile' from source: play vars 13355 1727096164.54696: variable 'port1_profile' from source: play vars 13355 1727096164.54715: variable 'dhcp_interface1' from source: play vars 13355 1727096164.54796: variable 'dhcp_interface1' from source: play vars 13355 1727096164.54846: variable 'controller_profile' from source: play vars 13355 1727096164.54910: variable 'controller_profile' from source: play vars 13355 1727096164.54931: variable 'port2_profile' from source: play vars 13355 1727096164.55020: variable 'port2_profile' from source: play vars 13355 1727096164.55048: variable 'dhcp_interface2' from source: play vars 13355 1727096164.55109: variable 'dhcp_interface2' from source: play vars 13355 1727096164.55121: variable 'controller_profile' from source: play vars 13355 1727096164.55191: variable 'controller_profile' from source: play vars 13355 1727096164.55232: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096164.55240: when evaluation is False, skipping this task 13355 1727096164.55257: _execute() done 13355 1727096164.55265: dumping result to json 13355 1727096164.55274: done dumping result, returning 13355 1727096164.55364: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-00000000002d] 13355 1727096164.55371: sending 
task result for task 0afff68d-5257-c514-593f-00000000002d 13355 1727096164.55444: done sending task result for task 0afff68d-5257-c514-593f-00000000002d 13355 1727096164.55447: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096164.55523: no more pending results, returning what we have 13355 1727096164.55528: results queue empty 13355 1727096164.55529: checking for any_errors_fatal 13355 1727096164.55534: done checking for any_errors_fatal 13355 1727096164.55535: checking for max_fail_percentage 13355 1727096164.55537: done checking for max_fail_percentage 13355 1727096164.55537: checking to see if all hosts have failed and the running result is not ok 13355 1727096164.55538: done checking to see if all hosts have failed 13355 1727096164.55539: getting the remaining hosts for this loop 13355 1727096164.55540: done getting the remaining hosts for this loop 13355 1727096164.55544: getting the next task for host managed_node3 13355 1727096164.55551: done getting next task for host managed_node3 13355 1727096164.55558: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13355 1727096164.55561: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096164.55674: getting variables 13355 1727096164.55676: in VariableManager get_vars() 13355 1727096164.55738: Calling all_inventory to load vars for managed_node3 13355 1727096164.55741: Calling groups_inventory to load vars for managed_node3 13355 1727096164.55743: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096164.55757: Calling all_plugins_play to load vars for managed_node3 13355 1727096164.55761: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096164.55764: Calling groups_plugins_play to load vars for managed_node3 13355 1727096164.58250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096164.59865: done with get_vars() 13355 1727096164.59903: done getting variables 13355 1727096164.59966: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:56:04 -0400 (0:00:00.124) 0:00:13.860 ****** 13355 1727096164.60008: entering _queue_task() for managed_node3/package 13355 1727096164.60480: worker is 1 (out of 1 available) 13355 1727096164.60491: exiting _queue_task() for managed_node3/package 13355 1727096164.60504: done queuing things up, now waiting for results queue to drain 13355 1727096164.60505: waiting for pending results... 
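The "Install packages" task queued here (task path `roles/network/tasks/main.yml:73`) ends up skipped further down in this trace because every package in `network_packages` already appears in the gathered package facts. The sketch below is an assumption-laden reconstruction: the `when` expression is copied verbatim from the `false_condition` field this trace later reports, while the `package` module arguments are illustrative guesses, not the role's source.

```yaml
# Hypothetical sketch of the "Install packages" task. The second
# when-expression is taken verbatim from this trace's false_condition
# field; the module arguments are illustrative assumptions.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - not network_packages is subset(ansible_facts.packages.keys())
```

The `subset` test returns True when every element of the left-hand list is contained in the right-hand collection, so the guard only lets the install run when at least one required package is missing — which is why this run, with all packages present, skips the task with "Conditional result was False".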
13355 1727096164.60743: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 13355 1727096164.60875: in run() - task 0afff68d-5257-c514-593f-00000000002e 13355 1727096164.60880: variable 'ansible_search_path' from source: unknown 13355 1727096164.60883: variable 'ansible_search_path' from source: unknown 13355 1727096164.60946: calling self._execute() 13355 1727096164.61066: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096164.61071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096164.61104: variable 'omit' from source: magic vars 13355 1727096164.61622: variable 'ansible_distribution_major_version' from source: facts 13355 1727096164.61734: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096164.62081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096164.62660: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096164.62711: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096164.62773: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096164.62880: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096164.63193: variable 'network_packages' from source: role '' defaults 13355 1727096164.63432: variable '__network_provider_setup' from source: role '' defaults 13355 1727096164.63451: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096164.63564: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096164.63819: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096164.63823: variable 
'__network_packages_default_nm' from source: role '' defaults 13355 1727096164.64254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096164.74965: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096164.75053: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096164.75097: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096164.75136: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096164.75185: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096164.75261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096164.75375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096164.75379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.75381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096164.75396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 
1727096164.75446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096164.75475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096164.75554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.75557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096164.75563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096164.75797: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096164.75930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096164.75957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096164.75991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.76039: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096164.76058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096164.76156: variable 'ansible_python' from source: facts 13355 1727096164.76207: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096164.76286: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096164.76377: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096164.76533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096164.76537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096164.76560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.76609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096164.76629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096164.76688: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096164.76750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096164.76753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.76787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096164.76804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096164.76955: variable 'network_connections' from source: task vars 13355 1727096164.77000: variable 'controller_profile' from source: play vars 13355 1727096164.77171: variable 'controller_profile' from source: play vars 13355 1727096164.77192: variable 'controller_device' from source: play vars 13355 1727096164.77455: variable 'controller_device' from source: play vars 13355 1727096164.77461: variable 'port1_profile' from source: play vars 13355 1727096164.77715: variable 'port1_profile' from source: play vars 13355 1727096164.77736: variable 'dhcp_interface1' from source: play vars 13355 1727096164.78037: variable 'dhcp_interface1' from source: play vars 13355 1727096164.78044: variable 'controller_profile' from source: play vars 13355 1727096164.78193: variable 'controller_profile' from source: play vars 13355 1727096164.78209: variable 'port2_profile' from source: play vars 13355 
1727096164.78327: variable 'port2_profile' from source: play vars 13355 1727096164.78342: variable 'dhcp_interface2' from source: play vars 13355 1727096164.78463: variable 'dhcp_interface2' from source: play vars 13355 1727096164.78577: variable 'controller_profile' from source: play vars 13355 1727096164.78794: variable 'controller_profile' from source: play vars 13355 1727096164.79066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096164.79094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096164.79307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096164.79312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096164.79357: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096164.80272: variable 'network_connections' from source: task vars 13355 1727096164.80415: variable 'controller_profile' from source: play vars 13355 1727096164.80509: variable 'controller_profile' from source: play vars 13355 1727096164.80528: variable 'controller_device' from source: play vars 13355 1727096164.80718: variable 'controller_device' from source: play vars 13355 1727096164.80727: variable 'port1_profile' from source: play vars 13355 1727096164.80985: variable 'port1_profile' from source: play vars 13355 1727096164.81176: variable 'dhcp_interface1' from source: play vars 13355 1727096164.81179: variable 'dhcp_interface1' from source: 
play vars 13355 1727096164.81184: variable 'controller_profile' from source: play vars 13355 1727096164.81402: variable 'controller_profile' from source: play vars 13355 1727096164.81481: variable 'port2_profile' from source: play vars 13355 1727096164.81640: variable 'port2_profile' from source: play vars 13355 1727096164.81654: variable 'dhcp_interface2' from source: play vars 13355 1727096164.81837: variable 'dhcp_interface2' from source: play vars 13355 1727096164.81954: variable 'controller_profile' from source: play vars 13355 1727096164.81959: variable 'controller_profile' from source: play vars 13355 1727096164.82041: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096164.82125: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096164.82610: variable 'network_connections' from source: task vars 13355 1727096164.82629: variable 'controller_profile' from source: play vars 13355 1727096164.82696: variable 'controller_profile' from source: play vars 13355 1727096164.82709: variable 'controller_device' from source: play vars 13355 1727096164.82956: variable 'controller_device' from source: play vars 13355 1727096164.82959: variable 'port1_profile' from source: play vars 13355 1727096164.82983: variable 'port1_profile' from source: play vars 13355 1727096164.82994: variable 'dhcp_interface1' from source: play vars 13355 1727096164.83060: variable 'dhcp_interface1' from source: play vars 13355 1727096164.83075: variable 'controller_profile' from source: play vars 13355 1727096164.83134: variable 'controller_profile' from source: play vars 13355 1727096164.83147: variable 'port2_profile' from source: play vars 13355 1727096164.83215: variable 'port2_profile' from source: play vars 13355 1727096164.83273: variable 'dhcp_interface2' from source: play vars 13355 1727096164.83296: variable 'dhcp_interface2' from source: play vars 13355 1727096164.83390: variable 'controller_profile' from 
source: play vars 13355 1727096164.83392: variable 'controller_profile' from source: play vars 13355 1727096164.83405: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096164.83485: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096164.83839: variable 'network_connections' from source: task vars 13355 1727096164.83851: variable 'controller_profile' from source: play vars 13355 1727096164.83922: variable 'controller_profile' from source: play vars 13355 1727096164.83944: variable 'controller_device' from source: play vars 13355 1727096164.84014: variable 'controller_device' from source: play vars 13355 1727096164.84028: variable 'port1_profile' from source: play vars 13355 1727096164.84106: variable 'port1_profile' from source: play vars 13355 1727096164.84153: variable 'dhcp_interface1' from source: play vars 13355 1727096164.84190: variable 'dhcp_interface1' from source: play vars 13355 1727096164.84203: variable 'controller_profile' from source: play vars 13355 1727096164.84304: variable 'controller_profile' from source: play vars 13355 1727096164.84317: variable 'port2_profile' from source: play vars 13355 1727096164.84391: variable 'port2_profile' from source: play vars 13355 1727096164.84473: variable 'dhcp_interface2' from source: play vars 13355 1727096164.84477: variable 'dhcp_interface2' from source: play vars 13355 1727096164.84480: variable 'controller_profile' from source: play vars 13355 1727096164.84544: variable 'controller_profile' from source: play vars 13355 1727096164.84624: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096164.84688: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096164.84872: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096164.84875: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 
1727096164.84980: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096164.85518: variable 'network_connections' from source: task vars 13355 1727096164.85575: variable 'controller_profile' from source: play vars 13355 1727096164.85694: variable 'controller_profile' from source: play vars 13355 1727096164.85782: variable 'controller_device' from source: play vars 13355 1727096164.85940: variable 'controller_device' from source: play vars 13355 1727096164.86014: variable 'port1_profile' from source: play vars 13355 1727096164.86199: variable 'port1_profile' from source: play vars 13355 1727096164.86202: variable 'dhcp_interface1' from source: play vars 13355 1727096164.86377: variable 'dhcp_interface1' from source: play vars 13355 1727096164.86415: variable 'controller_profile' from source: play vars 13355 1727096164.86584: variable 'controller_profile' from source: play vars 13355 1727096164.86596: variable 'port2_profile' from source: play vars 13355 1727096164.86727: variable 'port2_profile' from source: play vars 13355 1727096164.86730: variable 'dhcp_interface2' from source: play vars 13355 1727096164.86857: variable 'dhcp_interface2' from source: play vars 13355 1727096164.86873: variable 'controller_profile' from source: play vars 13355 1727096164.87015: variable 'controller_profile' from source: play vars 13355 1727096164.87069: variable 'ansible_distribution' from source: facts 13355 1727096164.87162: variable '__network_rh_distros' from source: role '' defaults 13355 1727096164.87165: variable 'ansible_distribution_major_version' from source: facts 13355 1727096164.87169: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096164.87509: variable 'ansible_distribution' from source: facts 13355 1727096164.87512: variable '__network_rh_distros' from source: role '' defaults 13355 1727096164.87515: variable 'ansible_distribution_major_version' from source: 
facts 13355 1727096164.87544: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13355 1727096164.87942: variable 'ansible_distribution' from source: facts 13355 1727096164.88081: variable '__network_rh_distros' from source: role '' defaults 13355 1727096164.88084: variable 'ansible_distribution_major_version' from source: facts 13355 1727096164.88086: variable 'network_provider' from source: set_fact 13355 1727096164.88089: variable 'ansible_facts' from source: unknown 13355 1727096164.88774: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13355 1727096164.88777: when evaluation is False, skipping this task 13355 1727096164.88780: _execute() done 13355 1727096164.88784: dumping result to json 13355 1727096164.88786: done dumping result, returning 13355 1727096164.88792: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-c514-593f-00000000002e] 13355 1727096164.88795: sending task result for task 0afff68d-5257-c514-593f-00000000002e 13355 1727096164.88888: done sending task result for task 0afff68d-5257-c514-593f-00000000002e 13355 1727096164.88891: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13355 1727096164.88939: no more pending results, returning what we have 13355 1727096164.88941: results queue empty 13355 1727096164.88942: checking for any_errors_fatal 13355 1727096164.88947: done checking for any_errors_fatal 13355 1727096164.88948: checking for max_fail_percentage 13355 1727096164.88950: done checking for max_fail_percentage 13355 1727096164.88950: checking to see if all hosts have failed and the running result is not ok 13355 1727096164.88951: done checking to see if all hosts have failed 13355 1727096164.88954: getting the remaining hosts for 
this loop 13355 1727096164.88955: done getting the remaining hosts for this loop 13355 1727096164.88959: getting the next task for host managed_node3 13355 1727096164.88964: done getting next task for host managed_node3 13355 1727096164.88973: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13355 1727096164.88976: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096164.88990: getting variables 13355 1727096164.88991: in VariableManager get_vars() 13355 1727096164.89040: Calling all_inventory to load vars for managed_node3 13355 1727096164.89043: Calling groups_inventory to load vars for managed_node3 13355 1727096164.89045: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096164.89055: Calling all_plugins_play to load vars for managed_node3 13355 1727096164.89058: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096164.89060: Calling groups_plugins_play to load vars for managed_node3 13355 1727096164.93952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096164.95004: done with get_vars() 13355 1727096164.95027: done getting variables 13355 1727096164.95070: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:56:04 -0400 (0:00:00.350) 0:00:14.211 ****** 13355 1727096164.95093: entering _queue_task() for managed_node3/package 13355 1727096164.95422: worker is 1 (out of 1 available) 13355 1727096164.95435: exiting _queue_task() for managed_node3/package 13355 1727096164.95448: done queuing things up, now waiting for results queue to drain 13355 1727096164.95450: waiting for pending results... 13355 1727096164.95681: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13355 1727096164.95932: in run() - task 0afff68d-5257-c514-593f-00000000002f 13355 1727096164.95941: variable 'ansible_search_path' from source: unknown 13355 1727096164.95944: variable 'ansible_search_path' from source: unknown 13355 1727096164.95947: calling self._execute() 13355 1727096164.96191: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096164.96197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096164.96201: variable 'omit' from source: magic vars 13355 1727096164.96526: variable 'ansible_distribution_major_version' from source: facts 13355 1727096164.96558: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096164.96694: variable 'network_state' from source: role '' defaults 13355 1727096164.96705: Evaluated conditional (network_state != {}): False 13355 1727096164.96709: when evaluation is False, skipping this task 13355 1727096164.96713: _execute() done 13355 
1727096164.96715: dumping result to json 13355 1727096164.96718: done dumping result, returning 13355 1727096164.96747: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-c514-593f-00000000002f] 13355 1727096164.96751: sending task result for task 0afff68d-5257-c514-593f-00000000002f skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096164.96994: no more pending results, returning what we have 13355 1727096164.96997: results queue empty 13355 1727096164.96998: checking for any_errors_fatal 13355 1727096164.97003: done checking for any_errors_fatal 13355 1727096164.97004: checking for max_fail_percentage 13355 1727096164.97005: done checking for max_fail_percentage 13355 1727096164.97006: checking to see if all hosts have failed and the running result is not ok 13355 1727096164.97007: done checking to see if all hosts have failed 13355 1727096164.97007: getting the remaining hosts for this loop 13355 1727096164.97009: done getting the remaining hosts for this loop 13355 1727096164.97012: getting the next task for host managed_node3 13355 1727096164.97017: done getting next task for host managed_node3 13355 1727096164.97021: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13355 1727096164.97023: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096164.97043: done sending task result for task 0afff68d-5257-c514-593f-00000000002f 13355 1727096164.97050: WORKER PROCESS EXITING 13355 1727096164.97151: getting variables 13355 1727096164.97155: in VariableManager get_vars() 13355 1727096164.97241: Calling all_inventory to load vars for managed_node3 13355 1727096164.97244: Calling groups_inventory to load vars for managed_node3 13355 1727096164.97246: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096164.97258: Calling all_plugins_play to load vars for managed_node3 13355 1727096164.97261: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096164.97265: Calling groups_plugins_play to load vars for managed_node3 13355 1727096164.99065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096165.00992: done with get_vars() 13355 1727096165.01025: done getting variables 13355 1727096165.01103: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:56:05 -0400 (0:00:00.060) 0:00:14.271 ****** 13355 1727096165.01139: entering _queue_task() for managed_node3/package 13355 1727096165.01703: worker is 1 (out of 1 available) 13355 1727096165.01714: exiting _queue_task() for managed_node3/package 13355 1727096165.01727: done queuing things up, now waiting for results queue to drain 13355 1727096165.01728: waiting for pending results... 
13355 1727096165.01942: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13355 1727096165.02085: in run() - task 0afff68d-5257-c514-593f-000000000030 13355 1727096165.02103: variable 'ansible_search_path' from source: unknown 13355 1727096165.02106: variable 'ansible_search_path' from source: unknown 13355 1727096165.02273: calling self._execute() 13355 1727096165.02277: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096165.02281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096165.02284: variable 'omit' from source: magic vars 13355 1727096165.02820: variable 'ansible_distribution_major_version' from source: facts 13355 1727096165.02850: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096165.02983: variable 'network_state' from source: role '' defaults 13355 1727096165.03013: Evaluated conditional (network_state != {}): False 13355 1727096165.03016: when evaluation is False, skipping this task 13355 1727096165.03019: _execute() done 13355 1727096165.03023: dumping result to json 13355 1727096165.03025: done dumping result, returning 13355 1727096165.03029: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-c514-593f-000000000030] 13355 1727096165.03031: sending task result for task 0afff68d-5257-c514-593f-000000000030 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096165.03218: no more pending results, returning what we have 13355 1727096165.03223: results queue empty 13355 1727096165.03224: checking for any_errors_fatal 13355 1727096165.03232: done checking for any_errors_fatal 13355 1727096165.03233: checking for max_fail_percentage 13355 
1727096165.03234: done checking for max_fail_percentage 13355 1727096165.03235: checking to see if all hosts have failed and the running result is not ok 13355 1727096165.03236: done checking to see if all hosts have failed 13355 1727096165.03237: getting the remaining hosts for this loop 13355 1727096165.03239: done getting the remaining hosts for this loop 13355 1727096165.03243: getting the next task for host managed_node3 13355 1727096165.03250: done getting next task for host managed_node3 13355 1727096165.03363: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13355 1727096165.03373: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096165.03398: done sending task result for task 0afff68d-5257-c514-593f-000000000030 13355 1727096165.03403: WORKER PROCESS EXITING 13355 1727096165.03414: getting variables 13355 1727096165.03416: in VariableManager get_vars() 13355 1727096165.03485: Calling all_inventory to load vars for managed_node3 13355 1727096165.03488: Calling groups_inventory to load vars for managed_node3 13355 1727096165.03491: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096165.03505: Calling all_plugins_play to load vars for managed_node3 13355 1727096165.03509: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096165.03512: Calling groups_plugins_play to load vars for managed_node3 13355 1727096165.05156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096165.06938: done with get_vars() 13355 1727096165.06986: done getting variables 13355 1727096165.07132: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:56:05 -0400 (0:00:00.060) 0:00:14.332 ****** 13355 1727096165.07167: entering _queue_task() for managed_node3/service 13355 1727096165.07170: Creating lock for service 13355 1727096165.07609: worker is 1 (out of 1 available) 13355 1727096165.07621: exiting _queue_task() for managed_node3/service 13355 1727096165.07633: done queuing things up, now waiting for results queue to drain 13355 1727096165.07635: waiting for pending results... 
13355 1727096165.07890: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13355 1727096165.08034: in run() - task 0afff68d-5257-c514-593f-000000000031 13355 1727096165.08137: variable 'ansible_search_path' from source: unknown 13355 1727096165.08141: variable 'ansible_search_path' from source: unknown 13355 1727096165.08144: calling self._execute() 13355 1727096165.08273: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096165.08377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096165.08657: variable 'omit' from source: magic vars 13355 1727096165.09367: variable 'ansible_distribution_major_version' from source: facts 13355 1727096165.09451: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096165.09774: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096165.10212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096165.13550: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096165.13644: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096165.13690: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096165.13734: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096165.13774: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096165.13863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13355 1727096165.13904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096165.13938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.14070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096165.14074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096165.14076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096165.14090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096165.14118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.14163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096165.14192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096165.14241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096165.14273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096165.14309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.14352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096165.14375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096165.14559: variable 'network_connections' from source: task vars 13355 1727096165.14581: variable 'controller_profile' from source: play vars 13355 1727096165.14672: variable 'controller_profile' from source: play vars 13355 1727096165.14675: variable 'controller_device' from source: play vars 13355 1727096165.14818: variable 'controller_device' from source: play vars 13355 1727096165.14873: variable 'port1_profile' from source: play vars 13355 1727096165.14902: variable 'port1_profile' from source: play vars 13355 1727096165.14917: variable 'dhcp_interface1' from source: play vars 13355 1727096165.14989: variable 'dhcp_interface1' from source: play vars 13355 1727096165.15001: variable 'controller_profile' from source: play vars 13355 
1727096165.15063: variable 'controller_profile' from source: play vars 13355 1727096165.15078: variable 'port2_profile' from source: play vars 13355 1727096165.15153: variable 'port2_profile' from source: play vars 13355 1727096165.15156: variable 'dhcp_interface2' from source: play vars 13355 1727096165.15213: variable 'dhcp_interface2' from source: play vars 13355 1727096165.15263: variable 'controller_profile' from source: play vars 13355 1727096165.15295: variable 'controller_profile' from source: play vars 13355 1727096165.15544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096165.15866: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096165.15925: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096165.15961: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096165.16003: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096165.16057: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096165.16089: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096165.16125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.16158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13355 1727096165.16252: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096165.17112: variable 'network_connections' from source: task vars 13355 1727096165.17117: variable 'controller_profile' from source: play vars 13355 1727096165.17156: variable 'controller_profile' from source: play vars 13355 1727096165.17173: variable 'controller_device' from source: play vars 13355 1727096165.17356: variable 'controller_device' from source: play vars 13355 1727096165.17375: variable 'port1_profile' from source: play vars 13355 1727096165.17483: variable 'port1_profile' from source: play vars 13355 1727096165.17496: variable 'dhcp_interface1' from source: play vars 13355 1727096165.17608: variable 'dhcp_interface1' from source: play vars 13355 1727096165.17666: variable 'controller_profile' from source: play vars 13355 1727096165.17874: variable 'controller_profile' from source: play vars 13355 1727096165.17878: variable 'port2_profile' from source: play vars 13355 1727096165.17918: variable 'port2_profile' from source: play vars 13355 1727096165.17986: variable 'dhcp_interface2' from source: play vars 13355 1727096165.18048: variable 'dhcp_interface2' from source: play vars 13355 1727096165.18224: variable 'controller_profile' from source: play vars 13355 1727096165.18289: variable 'controller_profile' from source: play vars 13355 1727096165.18350: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096165.18464: when evaluation is False, skipping this task 13355 1727096165.18474: _execute() done 13355 1727096165.18482: dumping result to json 13355 1727096165.18490: done dumping result, returning 13355 1727096165.18505: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000031] 13355 1727096165.18515: sending task result for task 
0afff68d-5257-c514-593f-000000000031 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096165.18688: no more pending results, returning what we have 13355 1727096165.18692: results queue empty 13355 1727096165.18693: checking for any_errors_fatal 13355 1727096165.18700: done checking for any_errors_fatal 13355 1727096165.18701: checking for max_fail_percentage 13355 1727096165.18703: done checking for max_fail_percentage 13355 1727096165.18703: checking to see if all hosts have failed and the running result is not ok 13355 1727096165.18704: done checking to see if all hosts have failed 13355 1727096165.18705: getting the remaining hosts for this loop 13355 1727096165.18706: done getting the remaining hosts for this loop 13355 1727096165.18710: getting the next task for host managed_node3 13355 1727096165.18717: done getting next task for host managed_node3 13355 1727096165.18720: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096165.18723: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096165.18737: getting variables 13355 1727096165.18739: in VariableManager get_vars() 13355 1727096165.18796: Calling all_inventory to load vars for managed_node3 13355 1727096165.18799: Calling groups_inventory to load vars for managed_node3 13355 1727096165.18801: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096165.18812: Calling all_plugins_play to load vars for managed_node3 13355 1727096165.18815: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096165.18818: Calling groups_plugins_play to load vars for managed_node3 13355 1727096165.19604: done sending task result for task 0afff68d-5257-c514-593f-000000000031 13355 1727096165.19609: WORKER PROCESS EXITING 13355 1727096165.20365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096165.22362: done with get_vars() 13355 1727096165.22477: done getting variables 13355 1727096165.22537: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:56:05 -0400 (0:00:00.153) 0:00:14.486 ****** 13355 1727096165.22571: entering _queue_task() for managed_node3/service 13355 1727096165.23130: worker is 1 (out of 1 available) 13355 1727096165.23145: exiting _queue_task() for managed_node3/service 13355 1727096165.23158: done queuing things up, now waiting for results queue to drain 13355 1727096165.23160: waiting for pending results... 
13355 1727096165.23687: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096165.23904: in run() - task 0afff68d-5257-c514-593f-000000000032 13355 1727096165.23920: variable 'ansible_search_path' from source: unknown 13355 1727096165.23923: variable 'ansible_search_path' from source: unknown 13355 1727096165.23961: calling self._execute() 13355 1727096165.24056: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096165.24070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096165.24177: variable 'omit' from source: magic vars 13355 1727096165.24549: variable 'ansible_distribution_major_version' from source: facts 13355 1727096165.24569: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096165.24753: variable 'network_provider' from source: set_fact 13355 1727096165.24764: variable 'network_state' from source: role '' defaults 13355 1727096165.24783: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13355 1727096165.24794: variable 'omit' from source: magic vars 13355 1727096165.24863: variable 'omit' from source: magic vars 13355 1727096165.24959: variable 'network_service_name' from source: role '' defaults 13355 1727096165.24982: variable 'network_service_name' from source: role '' defaults 13355 1727096165.25098: variable '__network_provider_setup' from source: role '' defaults 13355 1727096165.25109: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096165.25179: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096165.25194: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096165.25261: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096165.25511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13355 1727096165.27675: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096165.27679: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096165.27734: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096165.27779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096165.27806: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096165.27889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096165.27924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096165.27997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.28375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096165.28379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096165.28381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096165.28383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096165.28385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.28495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096165.28515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096165.28960: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096165.29099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096165.29134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096165.29163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.29212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096165.29235: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096165.29335: variable 'ansible_python' from source: facts 13355 1727096165.29365: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096165.29461: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096165.29549: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096165.29691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096165.29719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096165.29746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.29795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096165.29814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096165.29882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096165.29906: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096165.29990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.29993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096165.29995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096165.30134: variable 'network_connections' from source: task vars 13355 1727096165.30147: variable 'controller_profile' from source: play vars 13355 1727096165.30229: variable 'controller_profile' from source: play vars 13355 1727096165.30258: variable 'controller_device' from source: play vars 13355 1727096165.30347: variable 'controller_device' from source: play vars 13355 1727096165.30370: variable 'port1_profile' from source: play vars 13355 1727096165.30498: variable 'port1_profile' from source: play vars 13355 1727096165.30536: variable 'dhcp_interface1' from source: play vars 13355 1727096165.30604: variable 'dhcp_interface1' from source: play vars 13355 1727096165.30644: variable 'controller_profile' from source: play vars 13355 1727096165.30863: variable 'controller_profile' from source: play vars 13355 1727096165.30866: variable 'port2_profile' from source: play vars 13355 1727096165.30871: variable 'port2_profile' from source: play vars 13355 1727096165.30873: variable 'dhcp_interface2' from source: play vars 13355 1727096165.30912: variable 'dhcp_interface2' from source: play vars 13355 
1727096165.30930: variable 'controller_profile' from source: play vars 13355 1727096165.31014: variable 'controller_profile' from source: play vars 13355 1727096165.31139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096165.31372: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096165.31430: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096165.31480: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096165.31527: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096165.31598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096165.31636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096165.31676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096165.31713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096165.31777: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096165.32027: variable 'network_connections' from source: task vars 13355 1727096165.32038: variable 'controller_profile' from source: play vars 13355 1727096165.32121: variable 'controller_profile' from source: play vars 13355 
1727096165.32199: variable 'controller_device' from source: play vars 13355 1727096165.32281: variable 'controller_device' from source: play vars 13355 1727096165.32305: variable 'port1_profile' from source: play vars 13355 1727096165.32427: variable 'port1_profile' from source: play vars 13355 1727096165.32444: variable 'dhcp_interface1' from source: play vars 13355 1727096165.32637: variable 'dhcp_interface1' from source: play vars 13355 1727096165.32640: variable 'controller_profile' from source: play vars 13355 1727096165.33072: variable 'controller_profile' from source: play vars 13355 1727096165.33076: variable 'port2_profile' from source: play vars 13355 1727096165.33078: variable 'port2_profile' from source: play vars 13355 1727096165.33080: variable 'dhcp_interface2' from source: play vars 13355 1727096165.33083: variable 'dhcp_interface2' from source: play vars 13355 1727096165.33085: variable 'controller_profile' from source: play vars 13355 1727096165.33287: variable 'controller_profile' from source: play vars 13355 1727096165.33655: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096165.33759: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096165.34142: variable 'network_connections' from source: task vars 13355 1727096165.34152: variable 'controller_profile' from source: play vars 13355 1727096165.34229: variable 'controller_profile' from source: play vars 13355 1727096165.34308: variable 'controller_device' from source: play vars 13355 1727096165.34417: variable 'controller_device' from source: play vars 13355 1727096165.34420: variable 'port1_profile' from source: play vars 13355 1727096165.34636: variable 'port1_profile' from source: play vars 13355 1727096165.34639: variable 'dhcp_interface1' from source: play vars 13355 1727096165.34975: variable 'dhcp_interface1' from source: play vars 13355 1727096165.34978: variable 'controller_profile' from source: play vars 
13355 1727096165.34980: variable 'controller_profile' from source: play vars 13355 1727096165.34982: variable 'port2_profile' from source: play vars 13355 1727096165.35045: variable 'port2_profile' from source: play vars 13355 1727096165.35057: variable 'dhcp_interface2' from source: play vars 13355 1727096165.35138: variable 'dhcp_interface2' from source: play vars 13355 1727096165.35200: variable 'controller_profile' from source: play vars 13355 1727096165.35409: variable 'controller_profile' from source: play vars 13355 1727096165.35445: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096165.35589: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096165.36000: variable 'network_connections' from source: task vars 13355 1727096165.36010: variable 'controller_profile' from source: play vars 13355 1727096165.36087: variable 'controller_profile' from source: play vars 13355 1727096165.36098: variable 'controller_device' from source: play vars 13355 1727096165.36173: variable 'controller_device' from source: play vars 13355 1727096165.36187: variable 'port1_profile' from source: play vars 13355 1727096165.36257: variable 'port1_profile' from source: play vars 13355 1727096165.36272: variable 'dhcp_interface1' from source: play vars 13355 1727096165.36343: variable 'dhcp_interface1' from source: play vars 13355 1727096165.36354: variable 'controller_profile' from source: play vars 13355 1727096165.36429: variable 'controller_profile' from source: play vars 13355 1727096165.36441: variable 'port2_profile' from source: play vars 13355 1727096165.36574: variable 'port2_profile' from source: play vars 13355 1727096165.36577: variable 'dhcp_interface2' from source: play vars 13355 1727096165.36594: variable 'dhcp_interface2' from source: play vars 13355 1727096165.36609: variable 'controller_profile' from source: play vars 13355 1727096165.36680: variable 'controller_profile' from source: play vars 
13355 1727096165.36757: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096165.36830: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096165.36841: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096165.36903: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096165.37133: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096165.37640: variable 'network_connections' from source: task vars 13355 1727096165.37792: variable 'controller_profile' from source: play vars 13355 1727096165.37796: variable 'controller_profile' from source: play vars 13355 1727096165.37798: variable 'controller_device' from source: play vars 13355 1727096165.37886: variable 'controller_device' from source: play vars 13355 1727096165.37904: variable 'port1_profile' from source: play vars 13355 1727096165.37965: variable 'port1_profile' from source: play vars 13355 1727096165.37980: variable 'dhcp_interface1' from source: play vars 13355 1727096165.38118: variable 'dhcp_interface1' from source: play vars 13355 1727096165.38121: variable 'controller_profile' from source: play vars 13355 1727096165.38336: variable 'controller_profile' from source: play vars 13355 1727096165.38339: variable 'port2_profile' from source: play vars 13355 1727096165.38341: variable 'port2_profile' from source: play vars 13355 1727096165.38343: variable 'dhcp_interface2' from source: play vars 13355 1727096165.38421: variable 'dhcp_interface2' from source: play vars 13355 1727096165.38663: variable 'controller_profile' from source: play vars 13355 1727096165.38666: variable 'controller_profile' from source: play vars 13355 1727096165.38672: variable 'ansible_distribution' from source: facts 13355 1727096165.38674: variable '__network_rh_distros' from source: role '' defaults 13355 1727096165.38676: 
variable 'ansible_distribution_major_version' from source: facts 13355 1727096165.38779: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096165.39077: variable 'ansible_distribution' from source: facts 13355 1727096165.39105: variable '__network_rh_distros' from source: role '' defaults 13355 1727096165.39207: variable 'ansible_distribution_major_version' from source: facts 13355 1727096165.39210: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13355 1727096165.39499: variable 'ansible_distribution' from source: facts 13355 1727096165.39509: variable '__network_rh_distros' from source: role '' defaults 13355 1727096165.39518: variable 'ansible_distribution_major_version' from source: facts 13355 1727096165.39592: variable 'network_provider' from source: set_fact 13355 1727096165.39629: variable 'omit' from source: magic vars 13355 1727096165.39665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096165.39699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096165.39721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096165.39741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096165.39759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096165.39792: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096165.39800: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096165.39808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096165.39925: Set connection var ansible_shell_executable to /bin/sh 13355 
1727096165.39936: Set connection var ansible_shell_type to sh 13355 1727096165.39945: Set connection var ansible_pipelining to False 13355 1727096165.39952: Set connection var ansible_connection to ssh 13355 1727096165.39965: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096165.39977: Set connection var ansible_timeout to 10 13355 1727096165.40005: variable 'ansible_shell_executable' from source: unknown 13355 1727096165.40012: variable 'ansible_connection' from source: unknown 13355 1727096165.40020: variable 'ansible_module_compression' from source: unknown 13355 1727096165.40025: variable 'ansible_shell_type' from source: unknown 13355 1727096165.40031: variable 'ansible_shell_executable' from source: unknown 13355 1727096165.40074: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096165.40077: variable 'ansible_pipelining' from source: unknown 13355 1727096165.40079: variable 'ansible_timeout' from source: unknown 13355 1727096165.40081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096165.40160: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096165.40181: variable 'omit' from source: magic vars 13355 1727096165.40191: starting attempt loop 13355 1727096165.40197: running the handler 13355 1727096165.40283: variable 'ansible_facts' from source: unknown 13355 1727096165.41053: _low_level_execute_command(): starting 13355 1727096165.41066: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096165.41794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096165.41885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096165.41901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096165.41914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096165.41988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096165.43701: stdout chunk (state=3): >>>/root <<< 13355 1727096165.43835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096165.43876: stdout chunk (state=3): >>><<< 13355 1727096165.43880: stderr chunk (state=3): >>><<< 13355 1727096165.43990: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096165.43998: _low_level_execute_command(): starting 13355 1727096165.44002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952 `" && echo ansible-tmp-1727096165.4390254-14040-29819003639952="` echo /root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952 `" ) && sleep 0' 13355 1727096165.44496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096165.44527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096165.44530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096165.44580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096165.44583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096165.44600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096165.44632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096165.46635: stdout chunk (state=3): >>>ansible-tmp-1727096165.4390254-14040-29819003639952=/root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952 <<< 13355 1727096165.46844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096165.46847: stderr chunk (state=3): >>><<< 13355 1727096165.46850: stdout chunk (state=3): >>><<< 13355 1727096165.46872: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096165.4390254-14040-29819003639952=/root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096165.46899: variable 'ansible_module_compression' from source: unknown 13355 1727096165.46955: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 13355 1727096165.46959: ANSIBALLZ: Acquiring lock 13355 1727096165.46962: ANSIBALLZ: Lock acquired: 140397099650992 13355 1727096165.46964: ANSIBALLZ: Creating module 13355 1727096165.75061: ANSIBALLZ: Writing module into payload 13355 1727096165.75681: ANSIBALLZ: Writing module 13355 1727096165.75685: ANSIBALLZ: Renaming module 13355 1727096165.75688: ANSIBALLZ: Done creating module 13355 1727096165.75750: variable 'ansible_facts' from source: unknown 13355 1727096165.76150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/AnsiballZ_systemd.py 13355 1727096165.76563: Sending initial data 13355 1727096165.76566: Sent initial data (155 bytes) 13355 1727096165.77794: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096165.77904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096165.77955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096165.77988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096165.79704: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096165.79988: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpmorbvbiv" to remote "/root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/AnsiballZ_systemd.py" <<< 13355 1727096165.79992: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpmorbvbiv /root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/AnsiballZ_systemd.py <<< 13355 1727096165.81972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096165.81976: stderr chunk (state=3): >>><<< 13355 1727096165.81979: stdout chunk (state=3): >>><<< 13355 1727096165.82059: done transferring module to remote 13355 1727096165.82063: _low_level_execute_command(): starting 13355 1727096165.82065: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/ /root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/AnsiballZ_systemd.py && sleep 0' 13355 1727096165.83246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096165.83681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096165.83684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096165.83686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096165.83688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096165.85536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096165.85589: stderr chunk (state=3): >>><<< 13355 1727096165.85686: stdout chunk (state=3): >>><<< 13355 1727096165.85703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096165.85706: _low_level_execute_command(): starting 13355 1727096165.85712: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/AnsiballZ_systemd.py && sleep 0' 13355 1727096165.86916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096165.87085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096165.87096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096165.87110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096165.87122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096165.87129: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096165.87139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096165.87158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096165.87161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096165.87170: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096165.87177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096165.87450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096165.87531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096165.87548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096166.17414: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10395648", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304734720", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "886635000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 13355 1727096166.17426: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd<<< 13355 1727096166.17448: stdout chunk (state=3): >>>-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13355 1727096166.19526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096166.19531: stdout chunk (state=3): >>><<< 13355 1727096166.19533: stderr chunk (state=3): >>><<< 13355 1727096166.19929: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10395648", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3304734720", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "886635000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
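[Annotation] The module result above is the full systemd unit property map returned as JSON. A minimal sketch of how such a payload could be post-processed (the field names `status`, `ActiveState`, `UnitFileState`, `SubState` are taken from the output above; the helper `summarize_unit` is hypothetical, not part of Ansible):

```python
import json

def summarize_unit(result_json: str) -> dict:
    """Pull a few commonly checked fields from a systemd module result.

    Sketch only: assumes the JSON shape shown in the log above
    (a top-level "status" dict of systemd unit properties).
    """
    result = json.loads(result_json)
    status = result.get("status", {})
    return {
        "name": result.get("name"),
        "changed": result.get("changed"),
        # "active"/"enabled" mirror what `systemctl is-active` / `is-enabled` report
        "active": status.get("ActiveState") == "active",
        "enabled": status.get("UnitFileState") == "enabled",
        "substate": status.get("SubState"),
    }

# Example payload trimmed from the output above
sample = json.dumps({
    "name": "NetworkManager",
    "changed": False,
    "status": {
        "ActiveState": "active",
        "UnitFileState": "enabled",
        "SubState": "running",
    },
})
print(summarize_unit(sample))
```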
13355 1727096166.19941: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096166.19944: _low_level_execute_command(): starting 13355 1727096166.19946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096165.4390254-14040-29819003639952/ > /dev/null 2>&1 && sleep 0' 13355 1727096166.20563: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096166.20688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096166.20699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096166.20746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096166.20782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096166.22720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096166.22776: stdout chunk (state=3): >>><<< 13355 1727096166.22780: stderr chunk (state=3): >>><<< 13355 1727096166.22783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096166.22785: handler run complete 13355 
1727096166.22861: attempt loop complete, returning result 13355 1727096166.22876: _execute() done 13355 1727096166.22885: dumping result to json 13355 1727096166.22909: done dumping result, returning 13355 1727096166.22947: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-c514-593f-000000000032] 13355 1727096166.22955: sending task result for task 0afff68d-5257-c514-593f-000000000032 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096166.23524: no more pending results, returning what we have 13355 1727096166.23528: results queue empty 13355 1727096166.23529: checking for any_errors_fatal 13355 1727096166.23536: done checking for any_errors_fatal 13355 1727096166.23537: checking for max_fail_percentage 13355 1727096166.23539: done checking for max_fail_percentage 13355 1727096166.23540: checking to see if all hosts have failed and the running result is not ok 13355 1727096166.23541: done checking to see if all hosts have failed 13355 1727096166.23542: getting the remaining hosts for this loop 13355 1727096166.23543: done getting the remaining hosts for this loop 13355 1727096166.23547: getting the next task for host managed_node3 13355 1727096166.23562: done getting next task for host managed_node3 13355 1727096166.23569: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096166.23572: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096166.23584: getting variables 13355 1727096166.23586: in VariableManager get_vars() 13355 1727096166.23637: Calling all_inventory to load vars for managed_node3 13355 1727096166.23640: Calling groups_inventory to load vars for managed_node3 13355 1727096166.23643: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096166.23776: Calling all_plugins_play to load vars for managed_node3 13355 1727096166.23782: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096166.23789: done sending task result for task 0afff68d-5257-c514-593f-000000000032 13355 1727096166.23793: WORKER PROCESS EXITING 13355 1727096166.23797: Calling groups_plugins_play to load vars for managed_node3 13355 1727096166.24783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096166.25819: done with get_vars() 13355 1727096166.25841: done getting variables 13355 1727096166.25890: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:56:06 -0400 (0:00:01.033) 0:00:15.519 ****** 13355 1727096166.25914: entering _queue_task() for managed_node3/service 13355 1727096166.26179: worker is 1 (out of 1 available) 13355 1727096166.26193: exiting _queue_task() for managed_node3/service 13355 1727096166.26206: done queuing things up, now waiting for results queue to drain 13355 1727096166.26207: 
waiting for pending results... 13355 1727096166.26383: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096166.26515: in run() - task 0afff68d-5257-c514-593f-000000000033 13355 1727096166.26531: variable 'ansible_search_path' from source: unknown 13355 1727096166.26535: variable 'ansible_search_path' from source: unknown 13355 1727096166.26576: calling self._execute() 13355 1727096166.26668: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096166.26673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096166.26681: variable 'omit' from source: magic vars 13355 1727096166.27059: variable 'ansible_distribution_major_version' from source: facts 13355 1727096166.27063: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096166.27142: variable 'network_provider' from source: set_fact 13355 1727096166.27146: Evaluated conditional (network_provider == "nm"): True 13355 1727096166.27226: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096166.27299: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096166.27430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096166.29458: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096166.29502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096166.29547: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096166.29571: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096166.29591: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096166.29655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096166.29706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096166.29710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096166.29756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096166.29772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096166.29804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096166.29820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096166.29837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096166.29863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096166.29878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096166.29910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096166.29925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096166.29941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096166.29966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096166.29981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096166.30093: variable 'network_connections' from source: task vars 13355 1727096166.30106: variable 'controller_profile' from source: play vars 13355 1727096166.30158: variable 'controller_profile' from source: play vars 13355 1727096166.30164: variable 'controller_device' from source: play vars 13355 1727096166.30216: variable 'controller_device' from source: play vars 13355 1727096166.30219: variable 'port1_profile' from source: play vars 13355 1727096166.30262: variable 'port1_profile' from source: play vars 13355 
1727096166.30270: variable 'dhcp_interface1' from source: play vars 13355 1727096166.30332: variable 'dhcp_interface1' from source: play vars 13355 1727096166.30335: variable 'controller_profile' from source: play vars 13355 1727096166.30385: variable 'controller_profile' from source: play vars 13355 1727096166.30388: variable 'port2_profile' from source: play vars 13355 1727096166.30428: variable 'port2_profile' from source: play vars 13355 1727096166.30433: variable 'dhcp_interface2' from source: play vars 13355 1727096166.30509: variable 'dhcp_interface2' from source: play vars 13355 1727096166.30513: variable 'controller_profile' from source: play vars 13355 1727096166.30572: variable 'controller_profile' from source: play vars 13355 1727096166.30642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096166.30818: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096166.30859: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096166.30920: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096166.30923: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096166.30981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096166.30984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096166.31002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 13355 1727096166.31025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096166.31071: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096166.31323: variable 'network_connections' from source: task vars 13355 1727096166.31326: variable 'controller_profile' from source: play vars 13355 1727096166.31356: variable 'controller_profile' from source: play vars 13355 1727096166.31414: variable 'controller_device' from source: play vars 13355 1727096166.31514: variable 'controller_device' from source: play vars 13355 1727096166.31517: variable 'port1_profile' from source: play vars 13355 1727096166.31525: variable 'port1_profile' from source: play vars 13355 1727096166.31528: variable 'dhcp_interface1' from source: play vars 13355 1727096166.31575: variable 'dhcp_interface1' from source: play vars 13355 1727096166.31584: variable 'controller_profile' from source: play vars 13355 1727096166.31636: variable 'controller_profile' from source: play vars 13355 1727096166.31639: variable 'port2_profile' from source: play vars 13355 1727096166.31686: variable 'port2_profile' from source: play vars 13355 1727096166.31692: variable 'dhcp_interface2' from source: play vars 13355 1727096166.31732: variable 'dhcp_interface2' from source: play vars 13355 1727096166.31744: variable 'controller_profile' from source: play vars 13355 1727096166.31785: variable 'controller_profile' from source: play vars 13355 1727096166.31840: Evaluated conditional (__network_wpa_supplicant_required): False 13355 1727096166.31844: when evaluation is False, skipping this task 13355 1727096166.31847: _execute() done 13355 1727096166.31851: dumping result to json 13355 1727096166.31856: done dumping result, returning 13355 1727096166.31858: done running TaskExecutor() for 
managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-c514-593f-000000000033] 13355 1727096166.31861: sending task result for task 0afff68d-5257-c514-593f-000000000033 13355 1727096166.31964: done sending task result for task 0afff68d-5257-c514-593f-000000000033 13355 1727096166.31968: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13355 1727096166.32058: no more pending results, returning what we have 13355 1727096166.32062: results queue empty 13355 1727096166.32063: checking for any_errors_fatal 13355 1727096166.32084: done checking for any_errors_fatal 13355 1727096166.32086: checking for max_fail_percentage 13355 1727096166.32087: done checking for max_fail_percentage 13355 1727096166.32088: checking to see if all hosts have failed and the running result is not ok 13355 1727096166.32089: done checking to see if all hosts have failed 13355 1727096166.32090: getting the remaining hosts for this loop 13355 1727096166.32091: done getting the remaining hosts for this loop 13355 1727096166.32094: getting the next task for host managed_node3 13355 1727096166.32102: done getting next task for host managed_node3 13355 1727096166.32106: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096166.32108: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13355 1727096166.32122: getting variables 13355 1727096166.32123: in VariableManager get_vars() 13355 1727096166.32216: Calling all_inventory to load vars for managed_node3 13355 1727096166.32219: Calling groups_inventory to load vars for managed_node3 13355 1727096166.32221: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096166.32230: Calling all_plugins_play to load vars for managed_node3 13355 1727096166.32233: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096166.32235: Calling groups_plugins_play to load vars for managed_node3 13355 1727096166.33410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096166.35149: done with get_vars() 13355 1727096166.35260: done getting variables 13355 1727096166.35477: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:56:06 -0400 (0:00:00.095) 0:00:15.615 ****** 13355 1727096166.35515: entering _queue_task() for managed_node3/service 13355 1727096166.36033: worker is 1 (out of 1 available) 13355 1727096166.36046: exiting _queue_task() for managed_node3/service 13355 1727096166.36061: done queuing things up, now waiting for results queue to drain 13355 1727096166.36062: waiting for pending results... 
13355 1727096166.36299: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096166.36399: in run() - task 0afff68d-5257-c514-593f-000000000034 13355 1727096166.36411: variable 'ansible_search_path' from source: unknown 13355 1727096166.36417: variable 'ansible_search_path' from source: unknown 13355 1727096166.36459: calling self._execute() 13355 1727096166.36536: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096166.36541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096166.36550: variable 'omit' from source: magic vars 13355 1727096166.36875: variable 'ansible_distribution_major_version' from source: facts 13355 1727096166.36884: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096166.36969: variable 'network_provider' from source: set_fact 13355 1727096166.36973: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096166.36976: when evaluation is False, skipping this task 13355 1727096166.36979: _execute() done 13355 1727096166.36981: dumping result to json 13355 1727096166.36984: done dumping result, returning 13355 1727096166.36991: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-c514-593f-000000000034] 13355 1727096166.36996: sending task result for task 0afff68d-5257-c514-593f-000000000034 13355 1727096166.37081: done sending task result for task 0afff68d-5257-c514-593f-000000000034 13355 1727096166.37084: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096166.37132: no more pending results, returning what we have 13355 1727096166.37136: results queue empty 13355 1727096166.37136: checking for any_errors_fatal 13355 1727096166.37146: done checking for 
any_errors_fatal 13355 1727096166.37147: checking for max_fail_percentage 13355 1727096166.37148: done checking for max_fail_percentage 13355 1727096166.37149: checking to see if all hosts have failed and the running result is not ok 13355 1727096166.37152: done checking to see if all hosts have failed 13355 1727096166.37155: getting the remaining hosts for this loop 13355 1727096166.37156: done getting the remaining hosts for this loop 13355 1727096166.37160: getting the next task for host managed_node3 13355 1727096166.37166: done getting next task for host managed_node3 13355 1727096166.37173: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13355 1727096166.37175: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096166.37194: getting variables 13355 1727096166.37195: in VariableManager get_vars() 13355 1727096166.37242: Calling all_inventory to load vars for managed_node3 13355 1727096166.37244: Calling groups_inventory to load vars for managed_node3 13355 1727096166.37246: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096166.37254: Calling all_plugins_play to load vars for managed_node3 13355 1727096166.37257: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096166.37259: Calling groups_plugins_play to load vars for managed_node3 13355 1727096166.38723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096166.41697: done with get_vars() 13355 1727096166.41727: done getting variables 13355 1727096166.41800: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:56:06 -0400 (0:00:00.063) 0:00:15.678 ****** 13355 1727096166.41834: entering _queue_task() for managed_node3/copy 13355 1727096166.42431: worker is 1 (out of 1 available) 13355 1727096166.42447: exiting _queue_task() for managed_node3/copy 13355 1727096166.42464: done queuing things up, now waiting for results queue to drain 13355 1727096166.42466: waiting for pending results... 
13355 1727096166.42886: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13355 1727096166.42983: in run() - task 0afff68d-5257-c514-593f-000000000035 13355 1727096166.42990: variable 'ansible_search_path' from source: unknown 13355 1727096166.42993: variable 'ansible_search_path' from source: unknown 13355 1727096166.43019: calling self._execute() 13355 1727096166.43115: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096166.43127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096166.43141: variable 'omit' from source: magic vars 13355 1727096166.43633: variable 'ansible_distribution_major_version' from source: facts 13355 1727096166.43637: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096166.43720: variable 'network_provider' from source: set_fact 13355 1727096166.43736: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096166.43756: when evaluation is False, skipping this task 13355 1727096166.43766: _execute() done 13355 1727096166.43780: dumping result to json 13355 1727096166.43797: done dumping result, returning 13355 1727096166.43817: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-c514-593f-000000000035] 13355 1727096166.43861: sending task result for task 0afff68d-5257-c514-593f-000000000035 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096166.44064: no more pending results, returning what we have 13355 1727096166.44072: results queue empty 13355 1727096166.44073: checking for any_errors_fatal 13355 1727096166.44080: done checking for any_errors_fatal 13355 1727096166.44081: checking for max_fail_percentage 13355 
1727096166.44086: done checking for max_fail_percentage 13355 1727096166.44087: checking to see if all hosts have failed and the running result is not ok 13355 1727096166.44088: done checking to see if all hosts have failed 13355 1727096166.44088: getting the remaining hosts for this loop 13355 1727096166.44091: done getting the remaining hosts for this loop 13355 1727096166.44095: getting the next task for host managed_node3 13355 1727096166.44102: done getting next task for host managed_node3 13355 1727096166.44110: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13355 1727096166.44113: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096166.44130: getting variables 13355 1727096166.44132: in VariableManager get_vars() 13355 1727096166.44402: Calling all_inventory to load vars for managed_node3 13355 1727096166.44405: Calling groups_inventory to load vars for managed_node3 13355 1727096166.44408: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096166.44414: done sending task result for task 0afff68d-5257-c514-593f-000000000035 13355 1727096166.44417: WORKER PROCESS EXITING 13355 1727096166.44426: Calling all_plugins_play to load vars for managed_node3 13355 1727096166.44429: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096166.44432: Calling groups_plugins_play to load vars for managed_node3 13355 1727096166.45841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096166.47746: done with get_vars() 13355 1727096166.47886: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:56:06 -0400 (0:00:00.061) 0:00:15.740 ****** 13355 1727096166.47981: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13355 1727096166.47983: Creating lock for fedora.linux_system_roles.network_connections 13355 1727096166.48787: worker is 1 (out of 1 available) 13355 1727096166.48802: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13355 1727096166.48818: done queuing things up, now waiting for results queue to drain 13355 1727096166.48819: waiting for pending results... 
13355 1727096166.49140: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13355 1727096166.49290: in run() - task 0afff68d-5257-c514-593f-000000000036 13355 1727096166.49314: variable 'ansible_search_path' from source: unknown 13355 1727096166.49321: variable 'ansible_search_path' from source: unknown 13355 1727096166.49406: calling self._execute() 13355 1727096166.49463: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096166.49478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096166.49490: variable 'omit' from source: magic vars 13355 1727096166.49960: variable 'ansible_distribution_major_version' from source: facts 13355 1727096166.49981: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096166.49992: variable 'omit' from source: magic vars 13355 1727096166.50050: variable 'omit' from source: magic vars 13355 1727096166.50275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096166.52906: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096166.52978: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096166.53109: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096166.53336: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096166.53339: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096166.53490: variable 'network_provider' from source: set_fact 13355 1727096166.53737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096166.53974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096166.53978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096166.53981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096166.54173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096166.54193: variable 'omit' from source: magic vars 13355 1727096166.54640: variable 'omit' from source: magic vars 13355 1727096166.54859: variable 'network_connections' from source: task vars 13355 1727096166.54862: variable 'controller_profile' from source: play vars 13355 1727096166.54865: variable 'controller_profile' from source: play vars 13355 1727096166.54869: variable 'controller_device' from source: play vars 13355 1727096166.55019: variable 'controller_device' from source: play vars 13355 1727096166.55036: variable 'port1_profile' from source: play vars 13355 1727096166.55207: variable 'port1_profile' from source: play vars 13355 1727096166.55219: variable 'dhcp_interface1' from source: play vars 13355 1727096166.55283: variable 'dhcp_interface1' from source: play vars 13355 1727096166.55473: variable 'controller_profile' from source: play vars 13355 1727096166.55476: variable 'controller_profile' from source: play vars 13355 1727096166.55486: 
variable 'port2_profile' from source: play vars 13355 1727096166.55610: variable 'port2_profile' from source: play vars 13355 1727096166.55674: variable 'dhcp_interface2' from source: play vars 13355 1727096166.55735: variable 'dhcp_interface2' from source: play vars 13355 1727096166.55745: variable 'controller_profile' from source: play vars 13355 1727096166.55972: variable 'controller_profile' from source: play vars 13355 1727096166.56318: variable 'omit' from source: magic vars 13355 1727096166.56333: variable '__lsr_ansible_managed' from source: task vars 13355 1727096166.56573: variable '__lsr_ansible_managed' from source: task vars 13355 1727096166.56844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13355 1727096166.58044: Loaded config def from plugin (lookup/template) 13355 1727096166.58056: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13355 1727096166.58091: File lookup term: get_ansible_managed.j2 13355 1727096166.58132: variable 'ansible_search_path' from source: unknown 13355 1727096166.58142: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13355 1727096166.58163: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13355 1727096166.58344: variable 'ansible_search_path' from source: unknown 13355 1727096166.70775: variable 'ansible_managed' from source: unknown 13355 1727096166.71044: variable 'omit' from source: magic vars 13355 1727096166.71106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096166.71212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096166.71346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096166.71350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096166.71354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096166.71417: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096166.71427: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096166.71437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096166.71674: Set connection var ansible_shell_executable to /bin/sh 13355 1727096166.71677: Set connection var ansible_shell_type to sh 13355 1727096166.71680: Set connection var ansible_pipelining to False 13355 1727096166.71782: Set connection var ansible_connection to ssh 13355 1727096166.71785: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096166.71795: Set connection var ansible_timeout to 10 13355 1727096166.71797: 
variable 'ansible_shell_executable' from source: unknown 13355 1727096166.71800: variable 'ansible_connection' from source: unknown 13355 1727096166.71841: variable 'ansible_module_compression' from source: unknown 13355 1727096166.71851: variable 'ansible_shell_type' from source: unknown 13355 1727096166.71862: variable 'ansible_shell_executable' from source: unknown 13355 1727096166.71871: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096166.71880: variable 'ansible_pipelining' from source: unknown 13355 1727096166.71972: variable 'ansible_timeout' from source: unknown 13355 1727096166.71975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096166.72221: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096166.72325: variable 'omit' from source: magic vars 13355 1727096166.72328: starting attempt loop 13355 1727096166.72331: running the handler 13355 1727096166.72333: _low_level_execute_command(): starting 13355 1727096166.72335: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096166.74247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13355 1727096166.74293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096166.74336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096166.74733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096166.74800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096166.76517: stdout chunk (state=3): >>>/root <<< 13355 1727096166.76675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096166.76704: stderr chunk (state=3): >>><<< 13355 1727096166.76972: stdout chunk (state=3): >>><<< 13355 1727096166.76976: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096166.76979: _low_level_execute_command(): starting 13355 1727096166.76982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400 `" && echo ansible-tmp-1727096166.7678623-14102-141196210113400="` echo /root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400 `" ) && sleep 0' 13355 1727096166.78290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096166.78361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096166.78506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096166.78525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096166.78621: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096166.80704: stdout chunk (state=3): >>>ansible-tmp-1727096166.7678623-14102-141196210113400=/root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400 <<< 13355 1727096166.80911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096166.80922: stdout chunk (state=3): >>><<< 13355 1727096166.80935: stderr chunk (state=3): >>><<< 13355 1727096166.80996: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096166.7678623-14102-141196210113400=/root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096166.81051: variable 'ansible_module_compression' from source: unknown 13355 1727096166.81275: ANSIBALLZ: Using lock for 
fedora.linux_system_roles.network_connections 13355 1727096166.81283: ANSIBALLZ: Acquiring lock 13355 1727096166.81286: ANSIBALLZ: Lock acquired: 140397095761344 13355 1727096166.81288: ANSIBALLZ: Creating module 13355 1727096167.13319: ANSIBALLZ: Writing module into payload 13355 1727096167.13674: ANSIBALLZ: Writing module 13355 1727096167.13705: ANSIBALLZ: Renaming module 13355 1727096167.13716: ANSIBALLZ: Done creating module 13355 1727096167.13746: variable 'ansible_facts' from source: unknown 13355 1727096167.13893: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/AnsiballZ_network_connections.py 13355 1727096167.14106: Sending initial data 13355 1727096167.14109: Sent initial data (168 bytes) 13355 1727096167.14757: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096167.14764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096167.14779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096167.14844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096167.14847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096167.14857: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 13355 1727096167.14907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096167.14920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096167.14943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096167.15014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096167.16947: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096167.16952: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096167.17174: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpl25dlf6k /root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/AnsiballZ_network_connections.py <<< 13355 1727096167.17180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/AnsiballZ_network_connections.py" <<< 13355 1727096167.17182: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13355 1727096167.17185: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpl25dlf6k" to remote "/root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/AnsiballZ_network_connections.py" <<< 13355 1727096167.18697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096167.18975: stderr chunk (state=3): >>><<< 13355 1727096167.19000: stdout chunk (state=3): >>><<< 13355 1727096167.19160: done transferring module to remote 13355 1727096167.19201: _low_level_execute_command(): starting 13355 1727096167.19207: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/ /root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/AnsiballZ_network_connections.py && sleep 0' 13355 1727096167.19933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096167.19947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096167.19963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096167.19984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 
1727096167.20001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096167.20012: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096167.20025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096167.20051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096167.20157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096167.20181: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096167.20224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096167.22088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096167.22142: stderr chunk (state=3): >>><<< 13355 1727096167.22155: stdout chunk (state=3): >>><<< 13355 1727096167.22183: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096167.22192: _low_level_execute_command(): starting 13355 1727096167.22207: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/AnsiballZ_network_connections.py && sleep 0' 13355 1727096167.22845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096167.22863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096167.22889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096167.22999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096167.23040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096167.23158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096167.65593: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", 
"state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13355 1727096167.67675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096167.67679: stdout chunk (state=3): >>><<< 13355 1727096167.67682: stderr chunk (state=3): >>><<< 13355 1727096167.67684: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", 
"interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096167.67686: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096167.67693: _low_level_execute_command(): starting 13355 1727096167.67696: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096166.7678623-14102-141196210113400/ > /dev/null 2>&1 && sleep 0' 13355 1727096167.68425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096167.68433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096167.68452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096167.68483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096167.68495: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096167.68601: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096167.68618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096167.68688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096167.70644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096167.70693: stderr chunk (state=3): >>><<< 13355 1727096167.70696: stdout chunk (state=3): >>><<< 13355 1727096167.70712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096167.70717: handler run complete 13355 1727096167.70743: attempt loop complete, returning result 13355 1727096167.70746: _execute() done 13355 1727096167.70748: dumping result to json 13355 1727096167.70757: done dumping result, returning 13355 1727096167.70764: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-c514-593f-000000000036] 13355 1727096167.70768: sending task result for task 0afff68d-5257-c514-593f-000000000036 13355 1727096167.70888: done sending task result for task 0afff68d-5257-c514-593f-000000000036 13355 1727096167.70890: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
0719f6e4-c7c1-4297-9228-210ef46db39c [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active) 13355 1727096167.71027: no more pending results, returning what we have 13355 1727096167.71030: results queue empty 13355 1727096167.71031: checking for any_errors_fatal 13355 1727096167.71041: done checking for any_errors_fatal 13355 1727096167.71042: checking for max_fail_percentage 13355 1727096167.71044: done checking for max_fail_percentage 13355 1727096167.71045: checking to see if all hosts have failed and the running result is not ok 13355 1727096167.71045: done checking to see if all hosts have failed 13355 1727096167.71046: getting the remaining hosts for this loop 13355 1727096167.71047: done getting the remaining hosts for this loop 13355 1727096167.71051: getting the next task for host managed_node3 13355 1727096167.71061: done getting next task for host managed_node3 13355 1727096167.71065: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096167.71076: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096167.71086: getting variables 13355 1727096167.71088: in VariableManager get_vars() 13355 1727096167.71132: Calling all_inventory to load vars for managed_node3 13355 1727096167.71134: Calling groups_inventory to load vars for managed_node3 13355 1727096167.71137: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096167.71145: Calling all_plugins_play to load vars for managed_node3 13355 1727096167.71148: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096167.71150: Calling groups_plugins_play to load vars for managed_node3 13355 1727096167.72655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096167.73517: done with get_vars() 13355 1727096167.73537: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:56:07 -0400 (0:00:01.256) 0:00:16.996 ****** 13355 1727096167.73606: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13355 1727096167.73608: Creating lock for fedora.linux_system_roles.network_state 13355 1727096167.73876: worker is 1 (out of 1 available) 13355 1727096167.73889: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13355 1727096167.73903: done queuing things up, now waiting for results queue to drain 13355 1727096167.73905: waiting for pending results... 
13355 1727096167.74085: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096167.74176: in run() - task 0afff68d-5257-c514-593f-000000000037 13355 1727096167.74189: variable 'ansible_search_path' from source: unknown 13355 1727096167.74192: variable 'ansible_search_path' from source: unknown 13355 1727096167.74220: calling self._execute() 13355 1727096167.74295: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.74301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.74310: variable 'omit' from source: magic vars 13355 1727096167.74599: variable 'ansible_distribution_major_version' from source: facts 13355 1727096167.74608: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096167.74696: variable 'network_state' from source: role '' defaults 13355 1727096167.74705: Evaluated conditional (network_state != {}): False 13355 1727096167.74711: when evaluation is False, skipping this task 13355 1727096167.74714: _execute() done 13355 1727096167.74716: dumping result to json 13355 1727096167.74719: done dumping result, returning 13355 1727096167.74727: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-c514-593f-000000000037] 13355 1727096167.74731: sending task result for task 0afff68d-5257-c514-593f-000000000037 13355 1727096167.74819: done sending task result for task 0afff68d-5257-c514-593f-000000000037 13355 1727096167.74822: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096167.74878: no more pending results, returning what we have 13355 1727096167.74882: results queue empty 13355 1727096167.74883: checking for any_errors_fatal 13355 1727096167.74901: done checking for any_errors_fatal 
13355 1727096167.74901: checking for max_fail_percentage 13355 1727096167.74903: done checking for max_fail_percentage 13355 1727096167.74904: checking to see if all hosts have failed and the running result is not ok 13355 1727096167.74905: done checking to see if all hosts have failed 13355 1727096167.74905: getting the remaining hosts for this loop 13355 1727096167.74907: done getting the remaining hosts for this loop 13355 1727096167.74910: getting the next task for host managed_node3 13355 1727096167.74916: done getting next task for host managed_node3 13355 1727096167.74920: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096167.74923: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096167.74948: getting variables 13355 1727096167.74949: in VariableManager get_vars() 13355 1727096167.74994: Calling all_inventory to load vars for managed_node3 13355 1727096167.74996: Calling groups_inventory to load vars for managed_node3 13355 1727096167.74998: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096167.75007: Calling all_plugins_play to load vars for managed_node3 13355 1727096167.75010: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096167.75012: Calling groups_plugins_play to load vars for managed_node3 13355 1727096167.76069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096167.77396: done with get_vars() 13355 1727096167.77419: done getting variables 13355 1727096167.77469: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:56:07 -0400 (0:00:00.038) 0:00:17.035 ****** 13355 1727096167.77506: entering _queue_task() for managed_node3/debug 13355 1727096167.77764: worker is 1 (out of 1 available) 13355 1727096167.77779: exiting _queue_task() for managed_node3/debug 13355 1727096167.77791: done queuing things up, now waiting for results queue to drain 13355 1727096167.77792: waiting for pending results... 
13355 1727096167.77972: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096167.78063: in run() - task 0afff68d-5257-c514-593f-000000000038 13355 1727096167.78076: variable 'ansible_search_path' from source: unknown 13355 1727096167.78079: variable 'ansible_search_path' from source: unknown 13355 1727096167.78109: calling self._execute() 13355 1727096167.78183: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.78187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.78195: variable 'omit' from source: magic vars 13355 1727096167.78476: variable 'ansible_distribution_major_version' from source: facts 13355 1727096167.78486: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096167.78491: variable 'omit' from source: magic vars 13355 1727096167.78527: variable 'omit' from source: magic vars 13355 1727096167.78553: variable 'omit' from source: magic vars 13355 1727096167.78591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096167.78617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096167.78632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096167.78646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096167.78658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096167.78687: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096167.78690: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.78692: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13355 1727096167.78760: Set connection var ansible_shell_executable to /bin/sh 13355 1727096167.78764: Set connection var ansible_shell_type to sh 13355 1727096167.78771: Set connection var ansible_pipelining to False 13355 1727096167.78777: Set connection var ansible_connection to ssh 13355 1727096167.78790: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096167.78793: Set connection var ansible_timeout to 10 13355 1727096167.78807: variable 'ansible_shell_executable' from source: unknown 13355 1727096167.78809: variable 'ansible_connection' from source: unknown 13355 1727096167.78812: variable 'ansible_module_compression' from source: unknown 13355 1727096167.78814: variable 'ansible_shell_type' from source: unknown 13355 1727096167.78816: variable 'ansible_shell_executable' from source: unknown 13355 1727096167.78821: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.78823: variable 'ansible_pipelining' from source: unknown 13355 1727096167.78826: variable 'ansible_timeout' from source: unknown 13355 1727096167.78830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.78937: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096167.78947: variable 'omit' from source: magic vars 13355 1727096167.78952: starting attempt loop 13355 1727096167.78958: running the handler 13355 1727096167.79054: variable '__network_connections_result' from source: set_fact 13355 1727096167.79106: handler run complete 13355 1727096167.79120: attempt loop complete, returning result 13355 1727096167.79123: _execute() done 13355 1727096167.79126: dumping result to json 13355 1727096167.79129: 
done dumping result, returning 13355 1727096167.79137: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-c514-593f-000000000038] 13355 1727096167.79143: sending task result for task 0afff68d-5257-c514-593f-000000000038
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c",
        "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980",
        "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab",
        "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c (is-modified)",
        "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)",
        "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)"
    ]
}
13355 1727096167.79294: no more pending results, returning what we have 13355 1727096167.79297: results queue empty 13355 1727096167.79297: checking for any_errors_fatal 13355 1727096167.79304: done checking for any_errors_fatal 13355 1727096167.79305: checking for max_fail_percentage 13355 1727096167.79307: done checking for max_fail_percentage 13355 1727096167.79308: checking to see if all hosts have failed and the running result is not ok 13355 1727096167.79308: done checking to see if all hosts have failed 13355 1727096167.79309: getting the remaining hosts for this loop 13355 1727096167.79310: done getting the remaining hosts for this loop 13355 1727096167.79313: getting the next task for host managed_node3 13355 1727096167.79319: done getting next task for host managed_node3 13355 1727096167.79323: ^ task is: TASK:
fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096167.79326: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096167.79338: getting variables 13355 1727096167.79339: in VariableManager get_vars() 13355 1727096167.79387: Calling all_inventory to load vars for managed_node3 13355 1727096167.79390: Calling groups_inventory to load vars for managed_node3 13355 1727096167.79392: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096167.79401: Calling all_plugins_play to load vars for managed_node3 13355 1727096167.79403: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096167.79405: Calling groups_plugins_play to load vars for managed_node3 13355 1727096167.79981: done sending task result for task 0afff68d-5257-c514-593f-000000000038 13355 1727096167.79985: WORKER PROCESS EXITING 13355 1727096167.80889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096167.82204: done with get_vars() 13355 1727096167.82233: done getting variables 13355 1727096167.82296: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:56:07 -0400 (0:00:00.048) 0:00:17.083 ****** 13355 1727096167.82335: entering _queue_task() for managed_node3/debug 13355 1727096167.82638: worker is 1 (out of 1 available) 13355 1727096167.82652: exiting _queue_task() for managed_node3/debug 13355 1727096167.82664: done queuing things up, now waiting for results queue to drain 13355 1727096167.82665: waiting for pending results... 13355 1727096167.82850: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096167.82951: in run() - task 0afff68d-5257-c514-593f-000000000039 13355 1727096167.82971: variable 'ansible_search_path' from source: unknown 13355 1727096167.82975: variable 'ansible_search_path' from source: unknown 13355 1727096167.83006: calling self._execute() 13355 1727096167.83077: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.83082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.83092: variable 'omit' from source: magic vars 13355 1727096167.83384: variable 'ansible_distribution_major_version' from source: facts 13355 1727096167.83394: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096167.83400: variable 'omit' from source: magic vars 13355 1727096167.83438: variable 'omit' from source: magic vars 13355 1727096167.83469: variable 'omit' from source: magic vars 13355 1727096167.83500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096167.83528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096167.83545: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096167.83561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096167.83572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096167.83595: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096167.83599: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.83601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.83675: Set connection var ansible_shell_executable to /bin/sh 13355 1727096167.83681: Set connection var ansible_shell_type to sh 13355 1727096167.83686: Set connection var ansible_pipelining to False 13355 1727096167.83690: Set connection var ansible_connection to ssh 13355 1727096167.83696: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096167.83700: Set connection var ansible_timeout to 10 13355 1727096167.83718: variable 'ansible_shell_executable' from source: unknown 13355 1727096167.83720: variable 'ansible_connection' from source: unknown 13355 1727096167.83723: variable 'ansible_module_compression' from source: unknown 13355 1727096167.83725: variable 'ansible_shell_type' from source: unknown 13355 1727096167.83728: variable 'ansible_shell_executable' from source: unknown 13355 1727096167.83730: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.83733: variable 'ansible_pipelining' from source: unknown 13355 1727096167.83736: variable 'ansible_timeout' from source: unknown 13355 1727096167.83740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.83840: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096167.83865: variable 'omit' from source: magic vars 13355 1727096167.83869: starting attempt loop 13355 1727096167.83872: running the handler 13355 1727096167.83911: variable '__network_connections_result' from source: set_fact 13355 1727096167.83966: variable '__network_connections_result' from source: set_fact 13355 1727096167.84077: handler run complete 13355 1727096167.84097: attempt loop complete, returning result 13355 1727096167.84100: _execute() done 13355 1727096167.84103: dumping result to json 13355 1727096167.84108: done dumping result, returning 13355 1727096167.84116: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-c514-593f-000000000039] 13355 1727096167.84121: sending task result for task 0afff68d-5257-c514-593f-000000000039 13355 1727096167.84214: done sending task result for task 0afff68d-5257-c514-593f-000000000039 13355 1727096167.84217: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c",
            "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980",
            "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 0719f6e4-c7c1-4297-9228-210ef46db39c (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)"
        ]
    }
}
13355 1727096167.84315: no more pending results, returning what we have 13355 1727096167.84318: results queue empty 13355 1727096167.84333: checking for any_errors_fatal 13355 1727096167.84339: done checking for any_errors_fatal 13355 1727096167.84340: checking for max_fail_percentage 13355 1727096167.84341: done checking for max_fail_percentage 13355 1727096167.84342: checking to see if all hosts have failed and the running result is not ok 13355 1727096167.84343: done checking to see if all hosts have failed 13355 1727096167.84343: getting the remaining
hosts for this loop 13355 1727096167.84345: done getting the remaining hosts for this loop 13355 1727096167.84348: getting the next task for host managed_node3 13355 1727096167.84353: done getting next task for host managed_node3 13355 1727096167.84357: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096167.84359: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096167.84371: getting variables 13355 1727096167.84372: in VariableManager get_vars() 13355 1727096167.84412: Calling all_inventory to load vars for managed_node3 13355 1727096167.84415: Calling groups_inventory to load vars for managed_node3 13355 1727096167.84417: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096167.84425: Calling all_plugins_play to load vars for managed_node3 13355 1727096167.84427: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096167.84429: Calling groups_plugins_play to load vars for managed_node3 13355 1727096167.85303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096167.86537: done with get_vars() 13355 1727096167.86561: done getting variables 13355 1727096167.86608: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:56:07 -0400 (0:00:00.043) 0:00:17.126 ****** 13355 1727096167.86637: entering _queue_task() for managed_node3/debug 13355 1727096167.86906: worker is 1 (out of 1 available) 13355 1727096167.86919: exiting _queue_task() for managed_node3/debug 13355 1727096167.86932: done queuing things up, now waiting for results queue to drain 13355 1727096167.86934: waiting for pending results... 13355 1727096167.87124: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096167.87225: in run() - task 0afff68d-5257-c514-593f-00000000003a 13355 1727096167.87237: variable 'ansible_search_path' from source: unknown 13355 1727096167.87241: variable 'ansible_search_path' from source: unknown 13355 1727096167.87272: calling self._execute() 13355 1727096167.87344: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.87349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.87358: variable 'omit' from source: magic vars 13355 1727096167.87643: variable 'ansible_distribution_major_version' from source: facts 13355 1727096167.87655: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096167.87742: variable 'network_state' from source: role '' defaults 13355 1727096167.87751: Evaluated conditional (network_state != {}): False 13355 1727096167.87757: when evaluation is False, skipping this task 13355 1727096167.87760: _execute() done 13355 1727096167.87762: dumping result to json 13355 1727096167.87764: done 
dumping result, returning 13355 1727096167.87772: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-c514-593f-00000000003a] 13355 1727096167.87777: sending task result for task 0afff68d-5257-c514-593f-00000000003a 13355 1727096167.87864: done sending task result for task 0afff68d-5257-c514-593f-00000000003a 13355 1727096167.87866: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 13355 1727096167.87915: no more pending results, returning what we have 13355 1727096167.87918: results queue empty 13355 1727096167.87919: checking for any_errors_fatal 13355 1727096167.87930: done checking for any_errors_fatal 13355 1727096167.87930: checking for max_fail_percentage 13355 1727096167.87932: done checking for max_fail_percentage 13355 1727096167.87933: checking to see if all hosts have failed and the running result is not ok 13355 1727096167.87934: done checking to see if all hosts have failed 13355 1727096167.87934: getting the remaining hosts for this loop 13355 1727096167.87935: done getting the remaining hosts for this loop 13355 1727096167.87939: getting the next task for host managed_node3 13355 1727096167.87944: done getting next task for host managed_node3 13355 1727096167.87950: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096167.87955: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13355 1727096167.87972: getting variables 13355 1727096167.87973: in VariableManager get_vars() 13355 1727096167.88020: Calling all_inventory to load vars for managed_node3 13355 1727096167.88023: Calling groups_inventory to load vars for managed_node3 13355 1727096167.88024: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096167.88033: Calling all_plugins_play to load vars for managed_node3 13355 1727096167.88035: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096167.88037: Calling groups_plugins_play to load vars for managed_node3 13355 1727096167.88907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096167.89771: done with get_vars() 13355 1727096167.89789: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:56:07 -0400 (0:00:00.032) 0:00:17.159 ****** 13355 1727096167.89863: entering _queue_task() for managed_node3/ping 13355 1727096167.89864: Creating lock for ping 13355 1727096167.90128: worker is 1 (out of 1 available) 13355 1727096167.90143: exiting _queue_task() for managed_node3/ping 13355 1727096167.90154: done queuing things up, now waiting for results queue to drain 13355 1727096167.90156: waiting for pending results... 
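The `module_args` echoed in the "Show debug messages" result above map directly to the role's input: a `network_connections` variable like the following would produce that invocation. This is a reconstruction from the logged arguments (the original playbook is not part of this log), laid out per the role's documented variable interface:

```yaml
# Reconstructed from the logged module_args above; not the original playbook.
network_connections:
  - name: bond0
    type: bond
    interface_name: nm-bond
    state: up
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    type: ethernet
    interface_name: test1
    controller: bond0
    state: up
  - name: bond0.1
    type: ethernet
    interface_name: test2
    controller: bond0
    state: up
```

With `provider: nm` (the default chosen here), the role translates each entry into a NetworkManager connection profile, which is what the `add connection` / `up connection` stderr lines report.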
13355 1727096167.90339: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096167.90425: in run() - task 0afff68d-5257-c514-593f-00000000003b 13355 1727096167.90436: variable 'ansible_search_path' from source: unknown 13355 1727096167.90439: variable 'ansible_search_path' from source: unknown 13355 1727096167.90472: calling self._execute() 13355 1727096167.90542: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.90546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.90556: variable 'omit' from source: magic vars 13355 1727096167.90852: variable 'ansible_distribution_major_version' from source: facts 13355 1727096167.90864: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096167.90871: variable 'omit' from source: magic vars 13355 1727096167.90912: variable 'omit' from source: magic vars 13355 1727096167.90941: variable 'omit' from source: magic vars 13355 1727096167.90977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096167.91004: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096167.91020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096167.91035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096167.91049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096167.91073: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096167.91077: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.91079: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13355 1727096167.91147: Set connection var ansible_shell_executable to /bin/sh 13355 1727096167.91150: Set connection var ansible_shell_type to sh 13355 1727096167.91154: Set connection var ansible_pipelining to False 13355 1727096167.91166: Set connection var ansible_connection to ssh 13355 1727096167.91171: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096167.91174: Set connection var ansible_timeout to 10 13355 1727096167.91191: variable 'ansible_shell_executable' from source: unknown 13355 1727096167.91194: variable 'ansible_connection' from source: unknown 13355 1727096167.91197: variable 'ansible_module_compression' from source: unknown 13355 1727096167.91199: variable 'ansible_shell_type' from source: unknown 13355 1727096167.91201: variable 'ansible_shell_executable' from source: unknown 13355 1727096167.91204: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096167.91206: variable 'ansible_pipelining' from source: unknown 13355 1727096167.91208: variable 'ansible_timeout' from source: unknown 13355 1727096167.91213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096167.91360: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096167.91371: variable 'omit' from source: magic vars 13355 1727096167.91387: starting attempt loop 13355 1727096167.91390: running the handler 13355 1727096167.91393: _low_level_execute_command(): starting 13355 1727096167.91400: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096167.91925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 
1727096167.91930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096167.91934: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096167.91988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096167.91991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096167.91997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096167.92038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096167.93696: stdout chunk (state=3): >>>/root <<< 13355 1727096167.93797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096167.93827: stderr chunk (state=3): >>><<< 13355 1727096167.93831: stdout chunk (state=3): >>><<< 13355 1727096167.93856: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096167.93866: _low_level_execute_command(): starting 13355 1727096167.93875: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991 `" && echo ansible-tmp-1727096167.9385204-14163-169440473641991="` echo /root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991 `" ) && sleep 0' 13355 1727096167.94349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096167.94353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096167.94357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096167.94366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096167.94404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096167.94408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096167.94411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096167.94470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096167.96471: stdout chunk (state=3): >>>ansible-tmp-1727096167.9385204-14163-169440473641991=/root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991 <<< 13355 1727096167.96577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096167.96606: stderr chunk (state=3): >>><<< 13355 1727096167.96610: stdout chunk (state=3): >>><<< 13355 1727096167.96628: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096167.9385204-14163-169440473641991=/root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096167.96675: variable 'ansible_module_compression' from source: unknown 13355 1727096167.96716: ANSIBALLZ: Using lock for ping 13355 1727096167.96720: ANSIBALLZ: Acquiring lock 13355 1727096167.96722: ANSIBALLZ: Lock acquired: 140397095748768 13355 1727096167.96725: ANSIBALLZ: Creating module 13355 1727096168.04709: ANSIBALLZ: Writing module into payload 13355 1727096168.04754: ANSIBALLZ: Writing module 13355 1727096168.04776: ANSIBALLZ: Renaming module 13355 1727096168.04782: ANSIBALLZ: Done creating module 13355 1727096168.04798: variable 'ansible_facts' from source: unknown 13355 1727096168.04844: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/AnsiballZ_ping.py 13355 1727096168.04950: Sending initial data 13355 1727096168.04955: Sent initial data (153 bytes) 13355 1727096168.05427: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.05431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.05435: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096168.05437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.05487: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096168.05490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.05492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.05541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.07208: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096168.07243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096168.07277: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmplvbr5do1 /root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/AnsiballZ_ping.py <<< 13355 1727096168.07294: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/AnsiballZ_ping.py" <<< 13355 1727096168.07309: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmplvbr5do1" to remote "/root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/AnsiballZ_ping.py" <<< 13355 1727096168.07791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096168.07837: stderr chunk (state=3): >>><<< 13355 1727096168.07840: stdout chunk (state=3): >>><<< 13355 1727096168.07882: done transferring module to remote 13355 1727096168.07892: _low_level_execute_command(): starting 13355 1727096168.07896: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/ /root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/AnsiballZ_ping.py && sleep 0' 13355 1727096168.08378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.08381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.08384: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096168.08386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096168.08392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.08432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096168.08435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.08439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.08477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.10320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096168.10341: stderr chunk (state=3): >>><<< 13355 1727096168.10345: stdout chunk (state=3): >>><<< 13355 1727096168.10361: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096168.10364: _low_level_execute_command(): starting 13355 1727096168.10370: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/AnsiballZ_ping.py && sleep 0' 13355 1727096168.10826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096168.10830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.10834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096168.10836: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.10838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.10891: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.10895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.10943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.26428: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13355 1727096168.27839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096168.27869: stderr chunk (state=3): >>><<< 13355 1727096168.27873: stdout chunk (state=3): >>><<< 13355 1727096168.27887: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096168.27914: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096168.27922: _low_level_execute_command(): starting 13355 1727096168.27927: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096167.9385204-14163-169440473641991/ > /dev/null 2>&1 && sleep 0' 13355 1727096168.28390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.28393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.28395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.28397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.28455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096168.28464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.28471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.28499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.30456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096168.30461: stdout chunk (state=3): >>><<< 13355 1727096168.30464: stderr chunk (state=3): >>><<< 13355 1727096168.30674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096168.30678: handler run complete 13355 1727096168.30681: attempt loop complete, returning result 13355 1727096168.30683: _execute() done 13355 1727096168.30686: dumping result to json 13355 1727096168.30688: done dumping result, returning 13355 1727096168.30690: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-c514-593f-00000000003b] 13355 1727096168.30692: sending task result for task 0afff68d-5257-c514-593f-00000000003b 13355 1727096168.30761: done sending task result for task 0afff68d-5257-c514-593f-00000000003b 13355 1727096168.30764: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 13355 1727096168.30830: no more pending results, returning what we have 13355 1727096168.30834: results queue empty 13355 1727096168.30835: checking for any_errors_fatal 13355 1727096168.30841: done checking for any_errors_fatal 13355 1727096168.30842: checking for max_fail_percentage 13355 1727096168.30844: done checking for max_fail_percentage 13355 1727096168.30845: checking to see if all hosts have failed and the running result is not ok 13355 1727096168.30845: done checking to see if all hosts have failed 13355 1727096168.30846: getting the remaining hosts for this loop 13355 1727096168.30847: done getting the remaining hosts for this loop 13355 1727096168.30851: getting the next task for host managed_node3 13355 1727096168.30860: done getting next task for host managed_node3 13355 1727096168.30863: ^ task is: TASK: meta (role_complete) 13355 1727096168.30877: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096168.30890: getting variables 13355 1727096168.30892: in VariableManager get_vars() 13355 1727096168.30952: Calling all_inventory to load vars for managed_node3 13355 1727096168.30955: Calling groups_inventory to load vars for managed_node3 13355 1727096168.30958: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096168.31114: Calling all_plugins_play to load vars for managed_node3 13355 1727096168.31119: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096168.31122: Calling groups_plugins_play to load vars for managed_node3 13355 1727096168.32450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096168.34192: done with get_vars() 13355 1727096168.34220: done getting variables 13355 1727096168.34319: done queuing things up, now waiting for results queue to drain 13355 1727096168.34322: results queue empty 13355 1727096168.34323: checking for any_errors_fatal 13355 1727096168.34326: done checking for any_errors_fatal 13355 1727096168.34327: checking for max_fail_percentage 13355 1727096168.34328: done checking for max_fail_percentage 13355 1727096168.34328: checking to see if all hosts have failed and the running result is not ok 13355 1727096168.34329: done checking to see if all hosts have failed 13355 1727096168.34330: getting the remaining hosts for this loop 13355 1727096168.34331: done getting the remaining hosts for this loop 13355 1727096168.34333: getting the next task for host managed_node3 13355 1727096168.34339: done getting next task for host 
managed_node3 13355 1727096168.34341: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13355 1727096168.34344: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096168.34346: getting variables 13355 1727096168.34347: in VariableManager get_vars() 13355 1727096168.34376: Calling all_inventory to load vars for managed_node3 13355 1727096168.34379: Calling groups_inventory to load vars for managed_node3 13355 1727096168.34381: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096168.34387: Calling all_plugins_play to load vars for managed_node3 13355 1727096168.34389: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096168.34392: Calling groups_plugins_play to load vars for managed_node3 13355 1727096168.35309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096168.36178: done with get_vars() 13355 1727096168.36201: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Monday 23 September 2024 08:56:08 -0400 (0:00:00.463) 0:00:17.623 ****** 13355 1727096168.36262: entering _queue_task() for managed_node3/include_tasks 13355 1727096168.36535: worker is 1 (out of 1 available) 13355 1727096168.36548: exiting _queue_task() for 
managed_node3/include_tasks 13355 1727096168.36563: done queuing things up, now waiting for results queue to drain 13355 1727096168.36565: waiting for pending results... 13355 1727096168.36739: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 13355 1727096168.36826: in run() - task 0afff68d-5257-c514-593f-00000000006e 13355 1727096168.36838: variable 'ansible_search_path' from source: unknown 13355 1727096168.36841: variable 'ansible_search_path' from source: unknown 13355 1727096168.36872: calling self._execute() 13355 1727096168.36956: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.36960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.36971: variable 'omit' from source: magic vars 13355 1727096168.37256: variable 'ansible_distribution_major_version' from source: facts 13355 1727096168.37274: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096168.37280: _execute() done 13355 1727096168.37283: dumping result to json 13355 1727096168.37288: done dumping result, returning 13355 1727096168.37294: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [0afff68d-5257-c514-593f-00000000006e] 13355 1727096168.37299: sending task result for task 0afff68d-5257-c514-593f-00000000006e 13355 1727096168.37394: done sending task result for task 0afff68d-5257-c514-593f-00000000006e 13355 1727096168.37397: WORKER PROCESS EXITING 13355 1727096168.37425: no more pending results, returning what we have 13355 1727096168.37431: in VariableManager get_vars() 13355 1727096168.37495: Calling all_inventory to load vars for managed_node3 13355 1727096168.37498: Calling groups_inventory to load vars for managed_node3 13355 1727096168.37501: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096168.37513: Calling all_plugins_play to load vars for managed_node3 13355 
1727096168.37515: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096168.37518: Calling groups_plugins_play to load vars for managed_node3 13355 1727096168.38459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096168.39314: done with get_vars() 13355 1727096168.39330: variable 'ansible_search_path' from source: unknown 13355 1727096168.39332: variable 'ansible_search_path' from source: unknown 13355 1727096168.39363: we have included files to process 13355 1727096168.39364: generating all_blocks data 13355 1727096168.39365: done generating all_blocks data 13355 1727096168.39370: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096168.39371: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096168.39373: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13355 1727096168.39508: done processing included file 13355 1727096168.39510: iterating over new_blocks loaded from include file 13355 1727096168.39511: in VariableManager get_vars() 13355 1727096168.39532: done with get_vars() 13355 1727096168.39534: filtering new block on tags 13355 1727096168.39546: done filtering new block on tags 13355 1727096168.39547: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 13355 1727096168.39551: extending task lists for all hosts with included blocks 13355 1727096168.39612: done extending task lists 13355 1727096168.39613: done processing included files 13355 1727096168.39613: results queue empty 13355 1727096168.39614: checking for 
any_errors_fatal 13355 1727096168.39615: done checking for any_errors_fatal 13355 1727096168.39615: checking for max_fail_percentage 13355 1727096168.39616: done checking for max_fail_percentage 13355 1727096168.39616: checking to see if all hosts have failed and the running result is not ok 13355 1727096168.39617: done checking to see if all hosts have failed 13355 1727096168.39617: getting the remaining hosts for this loop 13355 1727096168.39618: done getting the remaining hosts for this loop 13355 1727096168.39620: getting the next task for host managed_node3 13355 1727096168.39622: done getting next task for host managed_node3 13355 1727096168.39623: ^ task is: TASK: Get stat for interface {{ interface }} 13355 1727096168.39625: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096168.39627: getting variables 13355 1727096168.39628: in VariableManager get_vars() 13355 1727096168.39643: Calling all_inventory to load vars for managed_node3 13355 1727096168.39645: Calling groups_inventory to load vars for managed_node3 13355 1727096168.39646: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096168.39651: Calling all_plugins_play to load vars for managed_node3 13355 1727096168.39655: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096168.39656: Calling groups_plugins_play to load vars for managed_node3 13355 1727096168.40313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096168.41169: done with get_vars() 13355 1727096168.41189: done getting variables 13355 1727096168.41311: variable 'interface' from source: task vars 13355 1727096168.41314: variable 'controller_device' from source: play vars 13355 1727096168.41356: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Monday 23 September 2024 08:56:08 -0400 (0:00:00.051) 0:00:17.674 ****** 13355 1727096168.41385: entering _queue_task() for managed_node3/stat 13355 1727096168.41656: worker is 1 (out of 1 available) 13355 1727096168.41671: exiting _queue_task() for managed_node3/stat 13355 1727096168.41683: done queuing things up, now waiting for results queue to drain 13355 1727096168.41686: waiting for pending results... 
13355 1727096168.41859: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 13355 1727096168.41948: in run() - task 0afff68d-5257-c514-593f-000000000337 13355 1727096168.41960: variable 'ansible_search_path' from source: unknown 13355 1727096168.41964: variable 'ansible_search_path' from source: unknown 13355 1727096168.41996: calling self._execute() 13355 1727096168.42070: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.42074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.42082: variable 'omit' from source: magic vars 13355 1727096168.42351: variable 'ansible_distribution_major_version' from source: facts 13355 1727096168.42370: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096168.42377: variable 'omit' from source: magic vars 13355 1727096168.42412: variable 'omit' from source: magic vars 13355 1727096168.42487: variable 'interface' from source: task vars 13355 1727096168.42490: variable 'controller_device' from source: play vars 13355 1727096168.42536: variable 'controller_device' from source: play vars 13355 1727096168.42551: variable 'omit' from source: magic vars 13355 1727096168.42591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096168.42619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096168.42636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096168.42649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096168.42660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096168.42689: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13355 1727096168.42693: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.42695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.42763: Set connection var ansible_shell_executable to /bin/sh 13355 1727096168.42769: Set connection var ansible_shell_type to sh 13355 1727096168.42775: Set connection var ansible_pipelining to False 13355 1727096168.42784: Set connection var ansible_connection to ssh 13355 1727096168.42787: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096168.42789: Set connection var ansible_timeout to 10 13355 1727096168.42811: variable 'ansible_shell_executable' from source: unknown 13355 1727096168.42814: variable 'ansible_connection' from source: unknown 13355 1727096168.42817: variable 'ansible_module_compression' from source: unknown 13355 1727096168.42819: variable 'ansible_shell_type' from source: unknown 13355 1727096168.42821: variable 'ansible_shell_executable' from source: unknown 13355 1727096168.42823: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.42826: variable 'ansible_pipelining' from source: unknown 13355 1727096168.42828: variable 'ansible_timeout' from source: unknown 13355 1727096168.42833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.42985: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096168.42996: variable 'omit' from source: magic vars 13355 1727096168.43003: starting attempt loop 13355 1727096168.43006: running the handler 13355 1727096168.43019: _low_level_execute_command(): starting 13355 1727096168.43027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 
1727096168.43543: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.43549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096168.43557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.43607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096168.43611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.43613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.43662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.45365: stdout chunk (state=3): >>>/root <<< 13355 1727096168.45456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096168.45489: stderr chunk (state=3): >>><<< 13355 1727096168.45494: stdout chunk (state=3): >>><<< 13355 1727096168.45520: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096168.45532: _low_level_execute_command(): starting 13355 1727096168.45538: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745 `" && echo ansible-tmp-1727096168.4552035-14179-176577330835745="` echo /root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745 `" ) && sleep 0' 13355 1727096168.46004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096168.46008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096168.46019: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13355 1727096168.46021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.46024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.46079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.46082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.46114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.48113: stdout chunk (state=3): >>>ansible-tmp-1727096168.4552035-14179-176577330835745=/root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745 <<< 13355 1727096168.48212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096168.48241: stderr chunk (state=3): >>><<< 13355 1727096168.48244: stdout chunk (state=3): >>><<< 13355 1727096168.48261: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096168.4552035-14179-176577330835745=/root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096168.48308: variable 'ansible_module_compression' from source: unknown 13355 1727096168.48359: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13355 1727096168.48391: variable 'ansible_facts' from source: unknown 13355 1727096168.48457: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/AnsiballZ_stat.py 13355 1727096168.48561: Sending initial data 13355 1727096168.48564: Sent initial data (153 bytes) 13355 1727096168.49044: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.49049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.49051: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096168.49057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096168.49059: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.49106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096168.49109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.49116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.49149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.50772: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096168.50798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096168.50826: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpt5tlls86 /root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/AnsiballZ_stat.py <<< 13355 1727096168.50833: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/AnsiballZ_stat.py" <<< 13355 1727096168.50861: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpt5tlls86" to remote "/root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/AnsiballZ_stat.py" <<< 13355 1727096168.50866: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/AnsiballZ_stat.py" <<< 13355 1727096168.51357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096168.51405: stderr chunk (state=3): >>><<< 13355 1727096168.51408: stdout chunk (state=3): >>><<< 13355 1727096168.51443: done transferring module to remote 13355 1727096168.51452: _low_level_execute_command(): starting 13355 1727096168.51459: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/ /root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/AnsiballZ_stat.py && sleep 0' 13355 1727096168.51926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096168.51930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 
1727096168.51932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096168.51939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096168.51941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.51993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096168.52002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.52005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.52038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.53909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096168.53934: stderr chunk (state=3): >>><<< 13355 1727096168.53937: stdout chunk (state=3): >>><<< 13355 1727096168.53956: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096168.53961: _low_level_execute_command(): starting 13355 1727096168.53968: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/AnsiballZ_stat.py && sleep 0' 13355 1727096168.54430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096168.54433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.54436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096168.54438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096168.54440: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.54494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096168.54502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.54506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.54544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.70390: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29576, "dev": 23, "nlink": 1, "atime": 1727096167.495849, "mtime": 1727096167.495849, "ctime": 1727096167.495849, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13355 1727096168.71855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096168.71884: stderr chunk (state=3): >>><<< 13355 1727096168.71888: stdout chunk (state=3): >>><<< 13355 1727096168.71905: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29576, "dev": 23, "nlink": 1, "atime": 1727096167.495849, "mtime": 1727096167.495849, "ctime": 1727096167.495849, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096168.71944: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096168.71952: _low_level_execute_command(): starting 13355 1727096168.71961: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096168.4552035-14179-176577330835745/ > /dev/null 2>&1 && sleep 0' 13355 1727096168.72432: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.72436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.72438: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096168.72440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096168.72497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096168.72504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096168.72506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096168.72541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096168.74678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096168.74682: stdout chunk (state=3): >>><<< 13355 1727096168.74684: stderr chunk (state=3): >>><<< 13355 1727096168.74687: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096168.74689: handler run complete 13355 1727096168.74691: attempt loop complete, returning result 13355 1727096168.74693: _execute() done 13355 1727096168.74695: dumping result to json 13355 1727096168.74697: done dumping result, returning 13355 1727096168.74699: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [0afff68d-5257-c514-593f-000000000337] 13355 1727096168.74701: sending task result for task 0afff68d-5257-c514-593f-000000000337 13355 1727096168.74783: done sending task result for task 0afff68d-5257-c514-593f-000000000337 13355 1727096168.74787: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1727096167.495849, "block_size": 4096, "blocks": 0, "ctime": 1727096167.495849, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29576, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727096167.495849, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13355 1727096168.74895: no more pending results, returning what we have 13355 1727096168.74903: 
results queue empty 13355 1727096168.74904: checking for any_errors_fatal 13355 1727096168.74906: done checking for any_errors_fatal 13355 1727096168.74907: checking for max_fail_percentage 13355 1727096168.74909: done checking for max_fail_percentage 13355 1727096168.74909: checking to see if all hosts have failed and the running result is not ok 13355 1727096168.74910: done checking to see if all hosts have failed 13355 1727096168.74911: getting the remaining hosts for this loop 13355 1727096168.74913: done getting the remaining hosts for this loop 13355 1727096168.74918: getting the next task for host managed_node3 13355 1727096168.74925: done getting next task for host managed_node3 13355 1727096168.74928: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13355 1727096168.74931: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096168.74935: getting variables 13355 1727096168.74936: in VariableManager get_vars() 13355 1727096168.75002: Calling all_inventory to load vars for managed_node3 13355 1727096168.75005: Calling groups_inventory to load vars for managed_node3 13355 1727096168.75007: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096168.75020: Calling all_plugins_play to load vars for managed_node3 13355 1727096168.75024: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096168.75027: Calling groups_plugins_play to load vars for managed_node3 13355 1727096168.76263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096168.77135: done with get_vars() 13355 1727096168.77158: done getting variables 13355 1727096168.77204: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096168.77294: variable 'interface' from source: task vars 13355 1727096168.77297: variable 'controller_device' from source: play vars 13355 1727096168.77336: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Monday 23 September 2024 08:56:08 -0400 (0:00:00.359) 0:00:18.034 ****** 13355 1727096168.77366: entering _queue_task() for managed_node3/assert 13355 1727096168.77628: worker is 1 (out of 1 available) 13355 1727096168.77641: exiting _queue_task() for managed_node3/assert 13355 1727096168.77655: done queuing things up, now waiting for results queue to drain 13355 1727096168.77656: waiting for pending results... 
13355 1727096168.77839: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 13355 1727096168.77928: in run() - task 0afff68d-5257-c514-593f-00000000006f 13355 1727096168.77940: variable 'ansible_search_path' from source: unknown 13355 1727096168.77944: variable 'ansible_search_path' from source: unknown 13355 1727096168.77976: calling self._execute() 13355 1727096168.78061: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.78065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.78094: variable 'omit' from source: magic vars 13355 1727096168.78494: variable 'ansible_distribution_major_version' from source: facts 13355 1727096168.78497: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096168.78500: variable 'omit' from source: magic vars 13355 1727096168.78502: variable 'omit' from source: magic vars 13355 1727096168.78633: variable 'interface' from source: task vars 13355 1727096168.78637: variable 'controller_device' from source: play vars 13355 1727096168.78671: variable 'controller_device' from source: play vars 13355 1727096168.78710: variable 'omit' from source: magic vars 13355 1727096168.78734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096168.78766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096168.78820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096168.78823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096168.78825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096168.78841: variable 'inventory_hostname' from source: 
host vars for 'managed_node3' 13355 1727096168.78844: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.78849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.78950: Set connection var ansible_shell_executable to /bin/sh 13355 1727096168.78959: Set connection var ansible_shell_type to sh 13355 1727096168.78965: Set connection var ansible_pipelining to False 13355 1727096168.78971: Set connection var ansible_connection to ssh 13355 1727096168.78977: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096168.78983: Set connection var ansible_timeout to 10 13355 1727096168.79009: variable 'ansible_shell_executable' from source: unknown 13355 1727096168.79012: variable 'ansible_connection' from source: unknown 13355 1727096168.79016: variable 'ansible_module_compression' from source: unknown 13355 1727096168.79018: variable 'ansible_shell_type' from source: unknown 13355 1727096168.79020: variable 'ansible_shell_executable' from source: unknown 13355 1727096168.79022: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.79025: variable 'ansible_pipelining' from source: unknown 13355 1727096168.79027: variable 'ansible_timeout' from source: unknown 13355 1727096168.79039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.79173: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096168.79184: variable 'omit' from source: magic vars 13355 1727096168.79254: starting attempt loop 13355 1727096168.79258: running the handler 13355 1727096168.79323: variable 'interface_stat' from source: set_fact 13355 1727096168.79343: Evaluated conditional 
(interface_stat.stat.exists): True 13355 1727096168.79348: handler run complete 13355 1727096168.79365: attempt loop complete, returning result 13355 1727096168.79374: _execute() done 13355 1727096168.79384: dumping result to json 13355 1727096168.79387: done dumping result, returning 13355 1727096168.79390: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [0afff68d-5257-c514-593f-00000000006f] 13355 1727096168.79392: sending task result for task 0afff68d-5257-c514-593f-00000000006f ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096168.79628: no more pending results, returning what we have 13355 1727096168.79631: results queue empty 13355 1727096168.79632: checking for any_errors_fatal 13355 1727096168.79638: done checking for any_errors_fatal 13355 1727096168.79639: checking for max_fail_percentage 13355 1727096168.79640: done checking for max_fail_percentage 13355 1727096168.79641: checking to see if all hosts have failed and the running result is not ok 13355 1727096168.79642: done checking to see if all hosts have failed 13355 1727096168.79642: getting the remaining hosts for this loop 13355 1727096168.79643: done getting the remaining hosts for this loop 13355 1727096168.79646: getting the next task for host managed_node3 13355 1727096168.79652: done getting next task for host managed_node3 13355 1727096168.79655: ^ task is: TASK: Include the task 'assert_profile_present.yml' 13355 1727096168.79656: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096168.79660: getting variables 13355 1727096168.79661: in VariableManager get_vars() 13355 1727096168.79787: Calling all_inventory to load vars for managed_node3 13355 1727096168.79790: Calling groups_inventory to load vars for managed_node3 13355 1727096168.79793: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096168.79803: Calling all_plugins_play to load vars for managed_node3 13355 1727096168.79806: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096168.79809: Calling groups_plugins_play to load vars for managed_node3 13355 1727096168.80451: done sending task result for task 0afff68d-5257-c514-593f-00000000006f 13355 1727096168.80455: WORKER PROCESS EXITING 13355 1727096168.81984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096168.83698: done with get_vars() 13355 1727096168.83722: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:67 Monday 23 September 2024 08:56:08 -0400 (0:00:00.064) 0:00:18.098 ****** 13355 1727096168.83825: entering _queue_task() for managed_node3/include_tasks 13355 1727096168.84278: worker is 1 (out of 1 available) 13355 1727096168.84292: exiting _queue_task() for managed_node3/include_tasks 13355 1727096168.84304: done queuing things up, now waiting for results queue to drain 13355 1727096168.84305: waiting for pending results... 
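For orientation while reading the trace: the "Assert that the interface is present - 'nm-bond'" task that just returned `ok` above (conditional `interface_stat.stat.exists` evaluated True, "All assertions passed") is consistent with a plain `assert` on a stat result registered earlier in the play. A minimal sketch — the task name and variable names are taken from the trace, but the exact file contents are an assumption:

```yaml
# Hedged sketch of the assertion traced above. interface_stat is assumed to
# come from an earlier stat/set_fact step; interface is derived from
# controller_device ('nm-bond') per the variable resolution in the trace.
- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
```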
13355 1727096168.84977: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 13355 1727096168.85341: in run() - task 0afff68d-5257-c514-593f-000000000070 13355 1727096168.85345: variable 'ansible_search_path' from source: unknown 13355 1727096168.85348: variable 'controller_profile' from source: play vars 13355 1727096168.85509: variable 'controller_profile' from source: play vars 13355 1727096168.85529: variable 'port1_profile' from source: play vars 13355 1727096168.85597: variable 'port1_profile' from source: play vars 13355 1727096168.85604: variable 'port2_profile' from source: play vars 13355 1727096168.85673: variable 'port2_profile' from source: play vars 13355 1727096168.85685: variable 'omit' from source: magic vars 13355 1727096168.85817: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.85825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.85842: variable 'omit' from source: magic vars 13355 1727096168.86314: variable 'ansible_distribution_major_version' from source: facts 13355 1727096168.86322: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096168.86354: variable 'item' from source: unknown 13355 1727096168.86525: variable 'item' from source: unknown 13355 1727096168.86661: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096168.86665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.86726: variable 'omit' from source: magic vars 13355 1727096168.87038: variable 'ansible_distribution_major_version' from source: facts 13355 1727096168.87044: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096168.87073: variable 'item' from source: unknown 13355 1727096168.87200: variable 'item' from source: unknown 13355 1727096168.87441: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 
1727096168.87444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096168.87447: variable 'omit' from source: magic vars 13355 1727096168.87774: variable 'ansible_distribution_major_version' from source: facts 13355 1727096168.87778: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096168.87780: variable 'item' from source: unknown 13355 1727096168.87923: variable 'item' from source: unknown 13355 1727096168.88101: dumping result to json 13355 1727096168.88105: done dumping result, returning 13355 1727096168.88112: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [0afff68d-5257-c514-593f-000000000070] 13355 1727096168.88115: sending task result for task 0afff68d-5257-c514-593f-000000000070 13355 1727096168.88158: done sending task result for task 0afff68d-5257-c514-593f-000000000070 13355 1727096168.88162: WORKER PROCESS EXITING 13355 1727096168.88248: no more pending results, returning what we have 13355 1727096168.88253: in VariableManager get_vars() 13355 1727096168.88323: Calling all_inventory to load vars for managed_node3 13355 1727096168.88326: Calling groups_inventory to load vars for managed_node3 13355 1727096168.88328: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096168.88342: Calling all_plugins_play to load vars for managed_node3 13355 1727096168.88346: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096168.88349: Calling groups_plugins_play to load vars for managed_node3 13355 1727096168.95763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096168.97868: done with get_vars() 13355 1727096168.97890: variable 'ansible_search_path' from source: unknown 13355 1727096168.97904: variable 'ansible_search_path' from source: unknown 13355 1727096168.97910: variable 'ansible_search_path' from source: unknown 13355 
1727096168.97914: we have included files to process 13355 1727096168.97915: generating all_blocks data 13355 1727096168.97916: done generating all_blocks data 13355 1727096168.97918: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.97919: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.97920: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.98039: in VariableManager get_vars() 13355 1727096168.98069: done with get_vars() 13355 1727096168.98236: done processing included file 13355 1727096168.98237: iterating over new_blocks loaded from include file 13355 1727096168.98238: in VariableManager get_vars() 13355 1727096168.98255: done with get_vars() 13355 1727096168.98256: filtering new block on tags 13355 1727096168.98270: done filtering new block on tags 13355 1727096168.98272: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 13355 1727096168.98275: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.98275: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.98277: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.98338: in VariableManager get_vars() 13355 1727096168.98357: done with get_vars() 13355 1727096168.98501: done 
processing included file 13355 1727096168.98503: iterating over new_blocks loaded from include file 13355 1727096168.98504: in VariableManager get_vars() 13355 1727096168.98518: done with get_vars() 13355 1727096168.98519: filtering new block on tags 13355 1727096168.98529: done filtering new block on tags 13355 1727096168.98531: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 13355 1727096168.98533: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.98534: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.98536: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13355 1727096168.98644: in VariableManager get_vars() 13355 1727096168.98665: done with get_vars() 13355 1727096168.98810: done processing included file 13355 1727096168.98812: iterating over new_blocks loaded from include file 13355 1727096168.98812: in VariableManager get_vars() 13355 1727096168.98826: done with get_vars() 13355 1727096168.98828: filtering new block on tags 13355 1727096168.98838: done filtering new block on tags 13355 1727096168.98840: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 13355 1727096168.98842: extending task lists for all hosts with included blocks 13355 1727096169.03321: done extending task lists 13355 1727096169.03328: done processing included files 13355 1727096169.03328: results queue empty 13355 
1727096169.03329: checking for any_errors_fatal 13355 1727096169.03331: done checking for any_errors_fatal 13355 1727096169.03332: checking for max_fail_percentage 13355 1727096169.03332: done checking for max_fail_percentage 13355 1727096169.03333: checking to see if all hosts have failed and the running result is not ok 13355 1727096169.03333: done checking to see if all hosts have failed 13355 1727096169.03334: getting the remaining hosts for this loop 13355 1727096169.03335: done getting the remaining hosts for this loop 13355 1727096169.03336: getting the next task for host managed_node3 13355 1727096169.03340: done getting next task for host managed_node3 13355 1727096169.03341: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13355 1727096169.03343: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096169.03344: getting variables 13355 1727096169.03345: in VariableManager get_vars() 13355 1727096169.03364: Calling all_inventory to load vars for managed_node3 13355 1727096169.03366: Calling groups_inventory to load vars for managed_node3 13355 1727096169.03369: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096169.03374: Calling all_plugins_play to load vars for managed_node3 13355 1727096169.03376: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096169.03377: Calling groups_plugins_play to load vars for managed_node3 13355 1727096169.04095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096169.04953: done with get_vars() 13355 1727096169.04984: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:56:09 -0400 (0:00:00.212) 0:00:18.311 ****** 13355 1727096169.05069: entering _queue_task() for managed_node3/include_tasks 13355 1727096169.05446: worker is 1 (out of 1 available) 13355 1727096169.05459: exiting _queue_task() for managed_node3/include_tasks 13355 1727096169.05480: done queuing things up, now waiting for results queue to drain 13355 1727096169.05482: waiting for pending results... 
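The include phase traced above (task path `tests_bond_removal.yml:67`) expands `assert_profile_present.yml` once per item — the trace shows inclusions for `bond0`, `bond0.0`, and `bond0.1`, resolved from the `controller_profile`, `port1_profile`, and `port2_profile` play vars, with the suite-wide RHEL 6 guard evaluated per item. A hedged sketch of what that task likely looks like (the `vars`/`loop` layout is an assumption; the variable names and conditional are from the trace):

```yaml
# Hedged reconstruction of the include at tests_bond_removal.yml:67.
# The trace shows 'profile' arriving as an include param and one inclusion
# per loop item: bond0, bond0.0, bond0.1.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - "{{ controller_profile }}"
    - "{{ port1_profile }}"
    - "{{ port2_profile }}"
  when: ansible_distribution_major_version != '6'
```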
13355 1727096169.05946: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 13355 1727096169.05954: in run() - task 0afff68d-5257-c514-593f-000000000355 13355 1727096169.05958: variable 'ansible_search_path' from source: unknown 13355 1727096169.05961: variable 'ansible_search_path' from source: unknown 13355 1727096169.05964: calling self._execute() 13355 1727096169.06015: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.06021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.06037: variable 'omit' from source: magic vars 13355 1727096169.06430: variable 'ansible_distribution_major_version' from source: facts 13355 1727096169.06441: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096169.06447: _execute() done 13355 1727096169.06450: dumping result to json 13355 1727096169.06455: done dumping result, returning 13355 1727096169.06462: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-c514-593f-000000000355] 13355 1727096169.06469: sending task result for task 0afff68d-5257-c514-593f-000000000355 13355 1727096169.06569: done sending task result for task 0afff68d-5257-c514-593f-000000000355 13355 1727096169.06572: WORKER PROCESS EXITING 13355 1727096169.06601: no more pending results, returning what we have 13355 1727096169.06607: in VariableManager get_vars() 13355 1727096169.06677: Calling all_inventory to load vars for managed_node3 13355 1727096169.06681: Calling groups_inventory to load vars for managed_node3 13355 1727096169.06683: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096169.06700: Calling all_plugins_play to load vars for managed_node3 13355 1727096169.06704: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096169.06707: Calling groups_plugins_play to load vars for managed_node3 13355 
1727096169.08261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096169.09918: done with get_vars() 13355 1727096169.09933: variable 'ansible_search_path' from source: unknown 13355 1727096169.09934: variable 'ansible_search_path' from source: unknown 13355 1727096169.09964: we have included files to process 13355 1727096169.09965: generating all_blocks data 13355 1727096169.09966: done generating all_blocks data 13355 1727096169.09969: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096169.09970: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096169.09972: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096169.10647: done processing included file 13355 1727096169.10649: iterating over new_blocks loaded from include file 13355 1727096169.10650: in VariableManager get_vars() 13355 1727096169.10672: done with get_vars() 13355 1727096169.10673: filtering new block on tags 13355 1727096169.10689: done filtering new block on tags 13355 1727096169.10691: in VariableManager get_vars() 13355 1727096169.10706: done with get_vars() 13355 1727096169.10707: filtering new block on tags 13355 1727096169.10719: done filtering new block on tags 13355 1727096169.10720: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 13355 1727096169.10724: extending task lists for all hosts with included blocks 13355 1727096169.10825: done extending task lists 13355 1727096169.10826: done processing included files 13355 1727096169.10826: results queue empty 13355 
1727096169.10827: checking for any_errors_fatal 13355 1727096169.10830: done checking for any_errors_fatal 13355 1727096169.10830: checking for max_fail_percentage 13355 1727096169.10831: done checking for max_fail_percentage 13355 1727096169.10831: checking to see if all hosts have failed and the running result is not ok 13355 1727096169.10832: done checking to see if all hosts have failed 13355 1727096169.10833: getting the remaining hosts for this loop 13355 1727096169.10834: done getting the remaining hosts for this loop 13355 1727096169.10835: getting the next task for host managed_node3 13355 1727096169.10837: done getting next task for host managed_node3 13355 1727096169.10839: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13355 1727096169.10841: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096169.10843: getting variables 13355 1727096169.10844: in VariableManager get_vars() 13355 1727096169.10906: Calling all_inventory to load vars for managed_node3 13355 1727096169.10908: Calling groups_inventory to load vars for managed_node3 13355 1727096169.10910: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096169.10914: Calling all_plugins_play to load vars for managed_node3 13355 1727096169.10916: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096169.10917: Calling groups_plugins_play to load vars for managed_node3 13355 1727096169.11831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096169.13408: done with get_vars() 13355 1727096169.13444: done getting variables 13355 1727096169.13488: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:56:09 -0400 (0:00:00.084) 0:00:18.395 ****** 13355 1727096169.13519: entering _queue_task() for managed_node3/set_fact 13355 1727096169.13972: worker is 1 (out of 1 available) 13355 1727096169.13997: exiting _queue_task() for managed_node3/set_fact 13355 1727096169.14013: done queuing things up, now waiting for results queue to drain 13355 1727096169.14014: waiting for pending results... 
13355 1727096169.14341: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13355 1727096169.14346: in run() - task 0afff68d-5257-c514-593f-0000000005e4 13355 1727096169.14350: variable 'ansible_search_path' from source: unknown 13355 1727096169.14355: variable 'ansible_search_path' from source: unknown 13355 1727096169.14387: calling self._execute() 13355 1727096169.14478: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.14482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.14501: variable 'omit' from source: magic vars 13355 1727096169.14873: variable 'ansible_distribution_major_version' from source: facts 13355 1727096169.14879: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096169.14890: variable 'omit' from source: magic vars 13355 1727096169.14974: variable 'omit' from source: magic vars 13355 1727096169.14983: variable 'omit' from source: magic vars 13355 1727096169.15026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096169.15066: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096169.15095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096169.15119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096169.15173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096169.15176: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096169.15179: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.15181: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13355 1727096169.15281: Set connection var ansible_shell_executable to /bin/sh 13355 1727096169.15293: Set connection var ansible_shell_type to sh 13355 1727096169.15304: Set connection var ansible_pipelining to False 13355 1727096169.15313: Set connection var ansible_connection to ssh 13355 1727096169.15324: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096169.15473: Set connection var ansible_timeout to 10 13355 1727096169.15476: variable 'ansible_shell_executable' from source: unknown 13355 1727096169.15479: variable 'ansible_connection' from source: unknown 13355 1727096169.15481: variable 'ansible_module_compression' from source: unknown 13355 1727096169.15484: variable 'ansible_shell_type' from source: unknown 13355 1727096169.15486: variable 'ansible_shell_executable' from source: unknown 13355 1727096169.15488: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.15490: variable 'ansible_pipelining' from source: unknown 13355 1727096169.15492: variable 'ansible_timeout' from source: unknown 13355 1727096169.15494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.15557: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096169.15582: variable 'omit' from source: magic vars 13355 1727096169.15595: starting attempt loop 13355 1727096169.15598: running the handler 13355 1727096169.15607: handler run complete 13355 1727096169.15616: attempt loop complete, returning result 13355 1727096169.15618: _execute() done 13355 1727096169.15621: dumping result to json 13355 1727096169.15623: done dumping result, returning 13355 1727096169.15630: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-c514-593f-0000000005e4] 13355 1727096169.15636: sending task result for task 0afff68d-5257-c514-593f-0000000005e4 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13355 1727096169.15798: no more pending results, returning what we have 13355 1727096169.15801: results queue empty 13355 1727096169.15802: checking for any_errors_fatal 13355 1727096169.15804: done checking for any_errors_fatal 13355 1727096169.15805: checking for max_fail_percentage 13355 1727096169.15806: done checking for max_fail_percentage 13355 1727096169.15807: checking to see if all hosts have failed and the running result is not ok 13355 1727096169.15807: done checking to see if all hosts have failed 13355 1727096169.15808: getting the remaining hosts for this loop 13355 1727096169.15809: done getting the remaining hosts for this loop 13355 1727096169.15813: getting the next task for host managed_node3 13355 1727096169.15820: done getting next task for host managed_node3 13355 1727096169.15823: ^ task is: TASK: Stat profile file 13355 1727096169.15827: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096169.15832: getting variables 13355 1727096169.15834: in VariableManager get_vars() 13355 1727096169.15889: Calling all_inventory to load vars for managed_node3 13355 1727096169.15892: Calling groups_inventory to load vars for managed_node3 13355 1727096169.15894: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096169.15903: Calling all_plugins_play to load vars for managed_node3 13355 1727096169.15905: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096169.15908: Calling groups_plugins_play to load vars for managed_node3 13355 1727096169.16814: done sending task result for task 0afff68d-5257-c514-593f-0000000005e4 13355 1727096169.16819: WORKER PROCESS EXITING 13355 1727096169.16830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096169.17910: done with get_vars() 13355 1727096169.17940: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:56:09 -0400 (0:00:00.045) 0:00:18.441 ****** 13355 1727096169.18046: entering _queue_task() for managed_node3/stat 13355 1727096169.18410: worker is 1 (out of 1 available) 13355 1727096169.18421: exiting _queue_task() for managed_node3/stat 13355 1727096169.18436: done queuing things up, now waiting for results queue to drain 13355 1727096169.18437: waiting for pending results... 
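The two `get_profile_stat.yml` tasks traced in this stretch — the `set_fact` that just returned `ok` and the "Stat profile file" task now being queued — can be sketched as follows. The three flag names and their `false` initial values are verbatim from the task result above; the stat `path` and `register` name are assumptions for illustration only:

```yaml
# Hedged sketch of the opening of get_profile_stat.yml as traced above.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

# The path below is hypothetical; the trace only shows that 'profile' and
# 'item' are resolved from include params before the stat module runs.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat
```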
13355 1727096169.18765: running TaskExecutor() for managed_node3/TASK: Stat profile file 13355 1727096169.18829: in run() - task 0afff68d-5257-c514-593f-0000000005e5 13355 1727096169.18874: variable 'ansible_search_path' from source: unknown 13355 1727096169.18883: variable 'ansible_search_path' from source: unknown 13355 1727096169.18939: calling self._execute() 13355 1727096169.19047: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.19080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.19085: variable 'omit' from source: magic vars 13355 1727096169.19743: variable 'ansible_distribution_major_version' from source: facts 13355 1727096169.19823: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096169.19961: variable 'omit' from source: magic vars 13355 1727096169.19965: variable 'omit' from source: magic vars 13355 1727096169.19969: variable 'profile' from source: include params 13355 1727096169.19972: variable 'item' from source: include params 13355 1727096169.20008: variable 'item' from source: include params 13355 1727096169.20028: variable 'omit' from source: magic vars 13355 1727096169.20073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096169.20113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096169.20179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096169.20182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096169.20184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096169.20190: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 
1727096169.20198: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.20202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.20303: Set connection var ansible_shell_executable to /bin/sh 13355 1727096169.20316: Set connection var ansible_shell_type to sh 13355 1727096169.20321: Set connection var ansible_pipelining to False 13355 1727096169.20396: Set connection var ansible_connection to ssh 13355 1727096169.20399: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096169.20402: Set connection var ansible_timeout to 10 13355 1727096169.20404: variable 'ansible_shell_executable' from source: unknown 13355 1727096169.20407: variable 'ansible_connection' from source: unknown 13355 1727096169.20409: variable 'ansible_module_compression' from source: unknown 13355 1727096169.20411: variable 'ansible_shell_type' from source: unknown 13355 1727096169.20413: variable 'ansible_shell_executable' from source: unknown 13355 1727096169.20415: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.20418: variable 'ansible_pipelining' from source: unknown 13355 1727096169.20421: variable 'ansible_timeout' from source: unknown 13355 1727096169.20423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.20597: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096169.20612: variable 'omit' from source: magic vars 13355 1727096169.20616: starting attempt loop 13355 1727096169.20618: running the handler 13355 1727096169.20632: _low_level_execute_command(): starting 13355 1727096169.20640: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096169.21378: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096169.21389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.21401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.21487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.21519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.21536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.21554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.21628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.23369: stdout chunk (state=3): >>>/root <<< 13355 1727096169.23487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.23558: stderr chunk (state=3): >>><<< 13355 1727096169.23561: stdout chunk (state=3): >>><<< 13355 1727096169.23588: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096169.23698: _low_level_execute_command(): starting 13355 1727096169.23702: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183 `" && echo ansible-tmp-1727096169.235963-14210-52003952133183="` echo /root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183 `" ) && sleep 0' 13355 1727096169.24431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096169.24447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.24463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.24494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096169.24543: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096169.24663: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.24688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.24772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.26732: stdout chunk (state=3): >>>ansible-tmp-1727096169.235963-14210-52003952133183=/root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183 <<< 13355 1727096169.26858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.26866: stdout chunk (state=3): >>><<< 13355 1727096169.26883: stderr chunk (state=3): >>><<< 13355 1727096169.26923: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096169.235963-14210-52003952133183=/root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096169.26963: variable 'ansible_module_compression' from source: unknown 13355 1727096169.27089: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13355 1727096169.27092: variable 'ansible_facts' from source: unknown 13355 1727096169.27157: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/AnsiballZ_stat.py 13355 1727096169.27473: Sending initial data 13355 1727096169.27477: Sent initial data (151 bytes) 13355 1727096169.27928: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096169.27982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.28036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.28040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.28070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.28160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.29851: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096169.29887: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096169.29929: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpt52mkq8y /root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/AnsiballZ_stat.py <<< 13355 1727096169.29932: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/AnsiballZ_stat.py" <<< 13355 1727096169.29965: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpt52mkq8y" to remote "/root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/AnsiballZ_stat.py" <<< 13355 1727096169.29970: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/AnsiballZ_stat.py" <<< 13355 1727096169.30758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.30761: stderr chunk (state=3): >>><<< 13355 1727096169.30763: stdout chunk (state=3): >>><<< 13355 1727096169.30775: done transferring module to remote 13355 1727096169.30787: _low_level_execute_command(): starting 13355 1727096169.30792: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/ /root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/AnsiballZ_stat.py && sleep 0' 13355 1727096169.31428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.31537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.31565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.33387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.33420: stderr chunk (state=3): >>><<< 13355 1727096169.33424: stdout chunk (state=3): >>><<< 13355 1727096169.33492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096169.33495: _low_level_execute_command(): starting 13355 1727096169.33500: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/AnsiballZ_stat.py && sleep 0' 13355 1727096169.34123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096169.34126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.34129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.34131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096169.34134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096169.34136: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096169.34138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.34140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096169.34143: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096169.34150: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096169.34161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.34178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.34237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096169.34240: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096169.34242: stderr chunk (state=3): >>>debug2: match found <<< 13355 1727096169.34244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.34283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.34296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.34313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.34375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.50007: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13355 1727096169.51433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.51437: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096169.51574: stderr chunk (state=3): >>><<< 13355 1727096169.51577: stdout chunk (state=3): >>><<< 13355 1727096169.51580: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096169.51582: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096169.51584: _low_level_execute_command(): starting 13355 1727096169.51593: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096169.235963-14210-52003952133183/ > /dev/null 2>&1 && sleep 0' 13355 1727096169.52242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096169.52280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.52304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.52347: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.52410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.52434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.52456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.52521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.54446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.54463: stderr chunk (state=3): >>><<< 13355 1727096169.54474: stdout chunk (state=3): >>><<< 13355 1727096169.54498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096169.54574: handler run complete 13355 1727096169.54577: attempt loop complete, returning result 13355 1727096169.54580: _execute() done 13355 1727096169.54582: dumping result to json 13355 1727096169.54584: done dumping result, returning 13355 1727096169.54586: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-c514-593f-0000000005e5] 13355 1727096169.54588: sending task result for task 0afff68d-5257-c514-593f-0000000005e5 13355 1727096169.54785: done sending task result for task 0afff68d-5257-c514-593f-0000000005e5 13355 1727096169.54789: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 13355 1727096169.54851: no more pending results, returning what we have 13355 1727096169.54854: results queue empty 13355 1727096169.54855: checking for any_errors_fatal 13355 1727096169.54862: done checking for any_errors_fatal 13355 1727096169.54863: checking for max_fail_percentage 13355 1727096169.54865: done checking for max_fail_percentage 13355 1727096169.54865: checking to see if all hosts have failed and the running result is not ok 13355 1727096169.54866: done checking to see if all hosts have failed 13355 1727096169.54869: getting the remaining hosts for this loop 13355 1727096169.54871: done getting the remaining hosts for this loop 13355 1727096169.54875: getting the next task for host managed_node3 13355 1727096169.54883: done getting next task for host managed_node3 13355 1727096169.54885: ^ task is: TASK: Set NM profile exist flag based on the profile files 13355 1727096169.54890: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096169.54895: getting variables 13355 1727096169.54896: in VariableManager get_vars() 13355 1727096169.54953: Calling all_inventory to load vars for managed_node3 13355 1727096169.54956: Calling groups_inventory to load vars for managed_node3 13355 1727096169.54958: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096169.55174: Calling all_plugins_play to load vars for managed_node3 13355 1727096169.55179: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096169.55183: Calling groups_plugins_play to load vars for managed_node3 13355 1727096169.56006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096169.56883: done with get_vars() 13355 1727096169.56908: done getting variables 13355 1727096169.56953: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:56:09 -0400 (0:00:00.389) 0:00:18.830 ****** 13355 1727096169.56985: entering _queue_task() for managed_node3/set_fact 13355 1727096169.57255: worker is 1 (out of 1 available) 13355 1727096169.57272: exiting _queue_task() for managed_node3/set_fact 13355 1727096169.57288: done queuing things up, now waiting for results queue to drain 13355 1727096169.57290: waiting for pending results... 13355 1727096169.57482: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 13355 1727096169.57595: in run() - task 0afff68d-5257-c514-593f-0000000005e6 13355 1727096169.57604: variable 'ansible_search_path' from source: unknown 13355 1727096169.57607: variable 'ansible_search_path' from source: unknown 13355 1727096169.57632: calling self._execute() 13355 1727096169.57713: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.57716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.57725: variable 'omit' from source: magic vars 13355 1727096169.58014: variable 'ansible_distribution_major_version' from source: facts 13355 1727096169.58027: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096169.58110: variable 'profile_stat' from source: set_fact 13355 1727096169.58122: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096169.58125: when evaluation is False, skipping this task 13355 1727096169.58129: _execute() done 13355 1727096169.58132: dumping result to json 13355 1727096169.58135: done dumping result, returning 13355 1727096169.58148: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-c514-593f-0000000005e6] 13355 1727096169.58150: sending task result for task 
0afff68d-5257-c514-593f-0000000005e6 13355 1727096169.58228: done sending task result for task 0afff68d-5257-c514-593f-0000000005e6 13355 1727096169.58231: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096169.58300: no more pending results, returning what we have 13355 1727096169.58304: results queue empty 13355 1727096169.58305: checking for any_errors_fatal 13355 1727096169.58313: done checking for any_errors_fatal 13355 1727096169.58313: checking for max_fail_percentage 13355 1727096169.58315: done checking for max_fail_percentage 13355 1727096169.58321: checking to see if all hosts have failed and the running result is not ok 13355 1727096169.58322: done checking to see if all hosts have failed 13355 1727096169.58323: getting the remaining hosts for this loop 13355 1727096169.58324: done getting the remaining hosts for this loop 13355 1727096169.58328: getting the next task for host managed_node3 13355 1727096169.58334: done getting next task for host managed_node3 13355 1727096169.58337: ^ task is: TASK: Get NM profile info 13355 1727096169.58341: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096169.58345: getting variables 13355 1727096169.58346: in VariableManager get_vars() 13355 1727096169.58399: Calling all_inventory to load vars for managed_node3 13355 1727096169.58401: Calling groups_inventory to load vars for managed_node3 13355 1727096169.58403: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096169.58417: Calling all_plugins_play to load vars for managed_node3 13355 1727096169.58420: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096169.58422: Calling groups_plugins_play to load vars for managed_node3 13355 1727096169.59551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096169.60408: done with get_vars() 13355 1727096169.60425: done getting variables 13355 1727096169.60473: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:56:09 -0400 (0:00:00.035) 0:00:18.865 ****** 13355 1727096169.60495: entering _queue_task() for managed_node3/shell 13355 1727096169.60751: worker is 1 (out of 1 available) 13355 1727096169.60766: exiting _queue_task() for managed_node3/shell 13355 1727096169.60780: done queuing things up, now waiting for results queue to drain 13355 1727096169.60783: waiting for pending results... 
13355 1727096169.60947: running TaskExecutor() for managed_node3/TASK: Get NM profile info 13355 1727096169.61027: in run() - task 0afff68d-5257-c514-593f-0000000005e7 13355 1727096169.61038: variable 'ansible_search_path' from source: unknown 13355 1727096169.61042: variable 'ansible_search_path' from source: unknown 13355 1727096169.61072: calling self._execute() 13355 1727096169.61147: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.61151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.61161: variable 'omit' from source: magic vars 13355 1727096169.61433: variable 'ansible_distribution_major_version' from source: facts 13355 1727096169.61449: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096169.61452: variable 'omit' from source: magic vars 13355 1727096169.61484: variable 'omit' from source: magic vars 13355 1727096169.61562: variable 'profile' from source: include params 13355 1727096169.61565: variable 'item' from source: include params 13355 1727096169.61606: variable 'item' from source: include params 13355 1727096169.61621: variable 'omit' from source: magic vars 13355 1727096169.61658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096169.61687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096169.61706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096169.61721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096169.61730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096169.61757: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 
1727096169.61761: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.61763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.61833: Set connection var ansible_shell_executable to /bin/sh 13355 1727096169.61838: Set connection var ansible_shell_type to sh 13355 1727096169.61843: Set connection var ansible_pipelining to False 13355 1727096169.61848: Set connection var ansible_connection to ssh 13355 1727096169.61856: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096169.61859: Set connection var ansible_timeout to 10 13355 1727096169.61884: variable 'ansible_shell_executable' from source: unknown 13355 1727096169.61887: variable 'ansible_connection' from source: unknown 13355 1727096169.61890: variable 'ansible_module_compression' from source: unknown 13355 1727096169.61892: variable 'ansible_shell_type' from source: unknown 13355 1727096169.61894: variable 'ansible_shell_executable' from source: unknown 13355 1727096169.61896: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096169.61899: variable 'ansible_pipelining' from source: unknown 13355 1727096169.61901: variable 'ansible_timeout' from source: unknown 13355 1727096169.61903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096169.62002: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096169.62011: variable 'omit' from source: magic vars 13355 1727096169.62015: starting attempt loop 13355 1727096169.62018: running the handler 13355 1727096169.62029: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096169.62044: _low_level_execute_command(): starting 13355 1727096169.62051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096169.62578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096169.62584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.62587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096169.62590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.62640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.62644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.62648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.62689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.64348: stdout chunk (state=3): >>>/root <<< 13355 1727096169.64444: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.64478: stderr chunk (state=3): >>><<< 13355 1727096169.64482: stdout chunk (state=3): >>><<< 13355 1727096169.64505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096169.64519: _low_level_execute_command(): starting 13355 1727096169.64527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902 `" && echo ansible-tmp-1727096169.6450512-14240-227064708606902="` echo /root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902 `" ) && sleep 0' 13355 1727096169.64987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config <<< 13355 1727096169.64991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.64994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.64996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.65048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.65051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.65058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.65094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.67069: stdout chunk (state=3): >>>ansible-tmp-1727096169.6450512-14240-227064708606902=/root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902 <<< 13355 1727096169.67165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.67202: stderr chunk (state=3): >>><<< 13355 1727096169.67205: stdout chunk (state=3): >>><<< 13355 1727096169.67221: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096169.6450512-14240-227064708606902=/root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096169.67252: variable 'ansible_module_compression' from source: unknown 13355 1727096169.67301: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096169.67333: variable 'ansible_facts' from source: unknown 13355 1727096169.67395: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/AnsiballZ_command.py 13355 1727096169.67497: Sending initial data 13355 1727096169.67500: Sent initial data (156 bytes) 13355 1727096169.67963: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 13355 1727096169.67967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096169.67971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.67973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096169.67975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.68026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.68029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.68035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.68072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.69693: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096169.69722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096169.69754: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpiz_z3wry /root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/AnsiballZ_command.py <<< 13355 1727096169.69766: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/AnsiballZ_command.py" <<< 13355 1727096169.69790: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpiz_z3wry" to remote "/root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/AnsiballZ_command.py" <<< 13355 1727096169.69792: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/AnsiballZ_command.py" <<< 13355 1727096169.70279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.70322: stderr chunk (state=3): >>><<< 13355 1727096169.70325: stdout chunk (state=3): >>><<< 13355 1727096169.70374: done transferring module to remote 13355 1727096169.70383: _low_level_execute_command(): starting 13355 1727096169.70388: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/ /root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/AnsiballZ_command.py && sleep 0' 13355 1727096169.70842: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.70849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.70852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.70854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096169.70856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.70906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.70913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.70916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.70945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.72750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.72776: stderr chunk (state=3): >>><<< 13355 1727096169.72782: stdout chunk (state=3): >>><<< 13355 1727096169.72796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096169.72799: _low_level_execute_command(): starting 13355 1727096169.72804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/AnsiballZ_command.py && sleep 0' 13355 1727096169.73257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.73260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096169.73263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.73265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.73269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.73324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.73328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.73332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.73373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.90937: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-23 08:56:09.885741", "end": "2024-09-23 08:56:09.906361", "delta": "0:00:00.020620", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096169.92892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096169.92897: stdout chunk (state=3): >>><<< 13355 1727096169.92899: stderr chunk (state=3): >>><<< 13355 1727096169.93045: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-23 08:56:09.885741", "end": "2024-09-23 08:56:09.906361", "delta": "0:00:00.020620", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096169.93049: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096169.93057: _low_level_execute_command(): starting 13355 1727096169.93060: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096169.6450512-14240-227064708606902/ > /dev/null 2>&1 && sleep 0' 13355 1727096169.93811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096169.93824: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096169.93840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096169.93863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096169.93883: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096169.93895: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096169.93907: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.93925: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096169.93983: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096169.94023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096169.94046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096169.94066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096169.94135: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096169.96071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096169.96082: stdout chunk (state=3): >>><<< 13355 1727096169.96095: stderr chunk (state=3): >>><<< 13355 1727096169.96115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096169.96127: handler run complete 13355 1727096169.96163: Evaluated conditional (False): False 13355 1727096169.96230: attempt loop complete, returning result 13355 1727096169.96238: _execute() done 13355 1727096169.96245: dumping result to json 13355 1727096169.96255: done dumping result, returning 13355 1727096169.96306: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-c514-593f-0000000005e7] 13355 1727096169.96309: sending task result for task 0afff68d-5257-c514-593f-0000000005e7 13355 1727096169.96754: done sending task result for task 0afff68d-5257-c514-593f-0000000005e7 13355 1727096169.96758: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.020620", "end": "2024-09-23 08:56:09.906361", "rc": 0, "start": "2024-09-23 08:56:09.885741" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 13355 1727096169.97042: no more pending results, returning what we have 13355 1727096169.97046: results queue empty 13355 1727096169.97047: checking for any_errors_fatal 13355 1727096169.97053: done checking for any_errors_fatal 13355 1727096169.97054: checking for 
max_fail_percentage 13355 1727096169.97056: done checking for max_fail_percentage 13355 1727096169.97057: checking to see if all hosts have failed and the running result is not ok 13355 1727096169.97057: done checking to see if all hosts have failed 13355 1727096169.97058: getting the remaining hosts for this loop 13355 1727096169.97059: done getting the remaining hosts for this loop 13355 1727096169.97063: getting the next task for host managed_node3 13355 1727096169.97072: done getting next task for host managed_node3 13355 1727096169.97075: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13355 1727096169.97084: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096169.97088: getting variables 13355 1727096169.97089: in VariableManager get_vars() 13355 1727096169.97147: Calling all_inventory to load vars for managed_node3 13355 1727096169.97150: Calling groups_inventory to load vars for managed_node3 13355 1727096169.97153: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096169.97166: Calling all_plugins_play to load vars for managed_node3 13355 1727096169.97574: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096169.97579: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.01889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.05755: done with get_vars() 13355 1727096170.05784: done getting variables 13355 1727096170.05960: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:56:10 -0400 (0:00:00.454) 0:00:19.320 ****** 13355 1727096170.05995: entering _queue_task() for managed_node3/set_fact 13355 1727096170.06766: worker is 1 (out of 1 available) 13355 1727096170.06781: exiting _queue_task() for managed_node3/set_fact 13355 1727096170.06793: done queuing things up, now waiting for results queue to drain 13355 1727096170.06794: waiting for pending results... 
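The `_low_level_execute_command()` calls earlier in this log create a per-task remote temp directory via `/bin/sh -c '( umask 77 && mkdir -p ... )'`, named `ansible-tmp-<epoch>-<pid>-<random>`. A minimal Python sketch of that naming and permission scheme, using a local stand-in path (the real root on the managed node is `/root/.ansible/tmp`):

```python
import os
import random
import time

# Local stand-in for the remote_tmp root; Ansible uses ~/.ansible/tmp on the target.
root = os.path.join("/tmp", "ansible-demo-tmp")

# Name pattern matching the log: ansible-tmp-<epoch float>-<pid>-<random integer>.
name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
path = os.path.join(root, name)

# umask 77 (octal 077) strips group/other bits, so the directory ends up 0700
# and is readable only by the connecting user.
old_umask = os.umask(0o077)
try:
    os.makedirs(path)  # equivalent of `mkdir -p root && mkdir root/name`
finally:
    os.umask(old_umask)
```

The tight permissions matter because module payloads (the `AnsiballZ_command.py` transferred above) land in this directory before execution.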
13355 1727096170.07440: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13355 1727096170.07874: in run() - task 0afff68d-5257-c514-593f-0000000005e8 13355 1727096170.07894: variable 'ansible_search_path' from source: unknown 13355 1727096170.07904: variable 'ansible_search_path' from source: unknown 13355 1727096170.07947: calling self._execute() 13355 1727096170.08415: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.08674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.08678: variable 'omit' from source: magic vars 13355 1727096170.09005: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.09188: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.09427: variable 'nm_profile_exists' from source: set_fact 13355 1727096170.09448: Evaluated conditional (nm_profile_exists.rc == 0): True 13355 1727096170.09460: variable 'omit' from source: magic vars 13355 1727096170.09509: variable 'omit' from source: magic vars 13355 1727096170.09607: variable 'omit' from source: magic vars 13355 1727096170.10073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096170.10076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096170.10079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096170.10080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.10082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.10084: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
13355 1727096170.10085: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.10087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.10284: Set connection var ansible_shell_executable to /bin/sh 13355 1727096170.10383: Set connection var ansible_shell_type to sh 13355 1727096170.10394: Set connection var ansible_pipelining to False 13355 1727096170.10402: Set connection var ansible_connection to ssh 13355 1727096170.10411: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096170.10426: Set connection var ansible_timeout to 10 13355 1727096170.10462: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.10679: variable 'ansible_connection' from source: unknown 13355 1727096170.10872: variable 'ansible_module_compression' from source: unknown 13355 1727096170.10875: variable 'ansible_shell_type' from source: unknown 13355 1727096170.10878: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.10880: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.10882: variable 'ansible_pipelining' from source: unknown 13355 1727096170.10885: variable 'ansible_timeout' from source: unknown 13355 1727096170.10887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.10898: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096170.10918: variable 'omit' from source: magic vars 13355 1727096170.10961: starting attempt loop 13355 1727096170.11008: running the handler 13355 1727096170.11026: handler run complete 13355 1727096170.11041: attempt loop complete, returning result 13355 1727096170.11117: _execute() done 
13355 1727096170.11125: dumping result to json 13355 1727096170.11132: done dumping result, returning 13355 1727096170.11145: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-c514-593f-0000000005e8] 13355 1727096170.11158: sending task result for task 0afff68d-5257-c514-593f-0000000005e8 13355 1727096170.11268: done sending task result for task 0afff68d-5257-c514-593f-0000000005e8 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13355 1727096170.11348: no more pending results, returning what we have 13355 1727096170.11351: results queue empty 13355 1727096170.11352: checking for any_errors_fatal 13355 1727096170.11363: done checking for any_errors_fatal 13355 1727096170.11364: checking for max_fail_percentage 13355 1727096170.11366: done checking for max_fail_percentage 13355 1727096170.11366: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.11368: done checking to see if all hosts have failed 13355 1727096170.11369: getting the remaining hosts for this loop 13355 1727096170.11371: done getting the remaining hosts for this loop 13355 1727096170.11375: getting the next task for host managed_node3 13355 1727096170.11385: done getting next task for host managed_node3 13355 1727096170.11388: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13355 1727096170.11392: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096170.11397: getting variables 13355 1727096170.11398: in VariableManager get_vars() 13355 1727096170.11505: Calling all_inventory to load vars for managed_node3 13355 1727096170.11508: Calling groups_inventory to load vars for managed_node3 13355 1727096170.11511: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.11524: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.11526: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.11530: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.12225: WORKER PROCESS EXITING 13355 1727096170.14344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.17746: done with get_vars() 13355 1727096170.17787: done getting variables 13355 1727096170.17853: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096170.18201: variable 'profile' from source: include params 13355 1727096170.18205: variable 'item' from source: include params 13355 1727096170.18501: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:56:10 -0400 (0:00:00.125) 0:00:19.446 ****** 13355 1727096170.18541: entering _queue_task() for managed_node3/command 13355 1727096170.19349: worker is 1 (out of 1 available) 13355 1727096170.19362: exiting _queue_task() for managed_node3/command 13355 1727096170.19377: done queuing things up, now waiting for results queue to drain 13355 1727096170.19379: waiting for pending results... 13355 1727096170.19888: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 13355 1727096170.19973: in run() - task 0afff68d-5257-c514-593f-0000000005ea 13355 1727096170.20202: variable 'ansible_search_path' from source: unknown 13355 1727096170.20207: variable 'ansible_search_path' from source: unknown 13355 1727096170.20374: calling self._execute() 13355 1727096170.20485: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.20489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.20492: variable 'omit' from source: magic vars 13355 1727096170.21362: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.21438: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.21657: variable 'profile_stat' from source: set_fact 13355 1727096170.21684: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096170.21696: when evaluation is False, skipping this task 13355 1727096170.21703: _execute() done 13355 1727096170.21710: dumping result to json 13355 1727096170.21718: done dumping result, returning 13355 1727096170.21729: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [0afff68d-5257-c514-593f-0000000005ea] 13355 1727096170.21738: sending task result for task 0afff68d-5257-c514-593f-0000000005ea skipping: 
[managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096170.21898: no more pending results, returning what we have 13355 1727096170.21903: results queue empty 13355 1727096170.21904: checking for any_errors_fatal 13355 1727096170.21910: done checking for any_errors_fatal 13355 1727096170.21910: checking for max_fail_percentage 13355 1727096170.21912: done checking for max_fail_percentage 13355 1727096170.21913: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.21913: done checking to see if all hosts have failed 13355 1727096170.21914: getting the remaining hosts for this loop 13355 1727096170.21915: done getting the remaining hosts for this loop 13355 1727096170.21919: getting the next task for host managed_node3 13355 1727096170.21926: done getting next task for host managed_node3 13355 1727096170.21928: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13355 1727096170.21932: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096170.21936: getting variables 13355 1727096170.21937: in VariableManager get_vars() 13355 1727096170.22097: Calling all_inventory to load vars for managed_node3 13355 1727096170.22100: Calling groups_inventory to load vars for managed_node3 13355 1727096170.22102: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.22114: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.22116: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.22120: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.22637: done sending task result for task 0afff68d-5257-c514-593f-0000000005ea 13355 1727096170.22641: WORKER PROCESS EXITING 13355 1727096170.23758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.26034: done with get_vars() 13355 1727096170.26075: done getting variables 13355 1727096170.26139: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096170.26264: variable 'profile' from source: include params 13355 1727096170.26269: variable 'item' from source: include params 13355 1727096170.26330: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:56:10 -0400 (0:00:00.078) 0:00:19.524 ****** 13355 1727096170.26368: entering _queue_task() for managed_node3/set_fact 13355 1727096170.26745: worker is 1 (out of 1 available) 13355 1727096170.26760: exiting _queue_task() for managed_node3/set_fact 13355 
1727096170.26878: done queuing things up, now waiting for results queue to drain 13355 1727096170.26880: waiting for pending results... 13355 1727096170.27065: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 13355 1727096170.27198: in run() - task 0afff68d-5257-c514-593f-0000000005eb 13355 1727096170.27223: variable 'ansible_search_path' from source: unknown 13355 1727096170.27231: variable 'ansible_search_path' from source: unknown 13355 1727096170.27277: calling self._execute() 13355 1727096170.27557: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.27563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.27569: variable 'omit' from source: magic vars 13355 1727096170.27856: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.27877: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.28005: variable 'profile_stat' from source: set_fact 13355 1727096170.28024: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096170.28032: when evaluation is False, skipping this task 13355 1727096170.28038: _execute() done 13355 1727096170.28060: dumping result to json 13355 1727096170.28063: done dumping result, returning 13355 1727096170.28073: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0afff68d-5257-c514-593f-0000000005eb] 13355 1727096170.28172: sending task result for task 0afff68d-5257-c514-593f-0000000005eb 13355 1727096170.28243: done sending task result for task 0afff68d-5257-c514-593f-0000000005eb 13355 1727096170.28246: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096170.28320: no more pending results, returning what we have 13355 1727096170.28325: results queue empty 13355 
1727096170.28326: checking for any_errors_fatal 13355 1727096170.28333: done checking for any_errors_fatal 13355 1727096170.28333: checking for max_fail_percentage 13355 1727096170.28336: done checking for max_fail_percentage 13355 1727096170.28336: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.28337: done checking to see if all hosts have failed 13355 1727096170.28338: getting the remaining hosts for this loop 13355 1727096170.28339: done getting the remaining hosts for this loop 13355 1727096170.28344: getting the next task for host managed_node3 13355 1727096170.28355: done getting next task for host managed_node3 13355 1727096170.28358: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13355 1727096170.28364: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096170.28371: getting variables 13355 1727096170.28373: in VariableManager get_vars() 13355 1727096170.28435: Calling all_inventory to load vars for managed_node3 13355 1727096170.28439: Calling groups_inventory to load vars for managed_node3 13355 1727096170.28441: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.28458: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.28463: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.28467: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.30083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.31659: done with get_vars() 13355 1727096170.31694: done getting variables 13355 1727096170.31759: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096170.31879: variable 'profile' from source: include params 13355 1727096170.31883: variable 'item' from source: include params 13355 1727096170.31941: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:56:10 -0400 (0:00:00.056) 0:00:19.580 ****** 13355 1727096170.31978: entering _queue_task() for managed_node3/command 13355 1727096170.32333: worker is 1 (out of 1 available) 13355 1727096170.32346: exiting _queue_task() for managed_node3/command 13355 1727096170.32362: done queuing things up, now waiting for results queue to drain 13355 1727096170.32363: waiting for pending results... 
13355 1727096170.32784: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 13355 1727096170.32790: in run() - task 0afff68d-5257-c514-593f-0000000005ec 13355 1727096170.32793: variable 'ansible_search_path' from source: unknown 13355 1727096170.32796: variable 'ansible_search_path' from source: unknown 13355 1727096170.32823: calling self._execute() 13355 1727096170.32924: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.32936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.32951: variable 'omit' from source: magic vars 13355 1727096170.33312: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.33329: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.33464: variable 'profile_stat' from source: set_fact 13355 1727096170.33484: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096170.33492: when evaluation is False, skipping this task 13355 1727096170.33498: _execute() done 13355 1727096170.33505: dumping result to json 13355 1727096170.33511: done dumping result, returning 13355 1727096170.33520: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [0afff68d-5257-c514-593f-0000000005ec] 13355 1727096170.33529: sending task result for task 0afff68d-5257-c514-593f-0000000005ec skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096170.33720: no more pending results, returning what we have 13355 1727096170.33725: results queue empty 13355 1727096170.33726: checking for any_errors_fatal 13355 1727096170.33731: done checking for any_errors_fatal 13355 1727096170.33732: checking for max_fail_percentage 13355 1727096170.33734: done checking for max_fail_percentage 13355 1727096170.33735: checking to see if all hosts have failed 
and the running result is not ok 13355 1727096170.33735: done checking to see if all hosts have failed 13355 1727096170.33736: getting the remaining hosts for this loop 13355 1727096170.33737: done getting the remaining hosts for this loop 13355 1727096170.33741: getting the next task for host managed_node3 13355 1727096170.33749: done getting next task for host managed_node3 13355 1727096170.33755: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13355 1727096170.33759: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096170.33765: getting variables 13355 1727096170.33766: in VariableManager get_vars() 13355 1727096170.33825: Calling all_inventory to load vars for managed_node3 13355 1727096170.33829: Calling groups_inventory to load vars for managed_node3 13355 1727096170.33832: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.33847: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.33850: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.33855: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.34489: done sending task result for task 0afff68d-5257-c514-593f-0000000005ec 13355 1727096170.34493: WORKER PROCESS EXITING 13355 1727096170.35583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.38098: done with get_vars() 13355 1727096170.38130: done getting variables 13355 1727096170.38402: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096170.38517: variable 'profile' from source: include params 13355 1727096170.38521: variable 'item' from source: include params 13355 1727096170.38786: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:56:10 -0400 (0:00:00.068) 0:00:19.648 ****** 13355 1727096170.38819: entering _queue_task() for managed_node3/set_fact 13355 1727096170.39589: worker is 1 (out of 1 available) 13355 1727096170.39600: exiting _queue_task() for managed_node3/set_fact 13355 
1727096170.39613: done queuing things up, now waiting for results queue to drain 13355 1727096170.39614: waiting for pending results... 13355 1727096170.40287: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 13355 1727096170.40292: in run() - task 0afff68d-5257-c514-593f-0000000005ed 13355 1727096170.40296: variable 'ansible_search_path' from source: unknown 13355 1727096170.40895: variable 'ansible_search_path' from source: unknown 13355 1727096170.40902: calling self._execute() 13355 1727096170.41134: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.41146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.41232: variable 'omit' from source: magic vars 13355 1727096170.42269: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.42573: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.42778: variable 'profile_stat' from source: set_fact 13355 1727096170.42861: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096170.42873: when evaluation is False, skipping this task 13355 1727096170.42880: _execute() done 13355 1727096170.42887: dumping result to json 13355 1727096170.42895: done dumping result, returning 13355 1727096170.42907: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [0afff68d-5257-c514-593f-0000000005ed] 13355 1727096170.42917: sending task result for task 0afff68d-5257-c514-593f-0000000005ed 13355 1727096170.43069: done sending task result for task 0afff68d-5257-c514-593f-0000000005ed skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096170.43123: no more pending results, returning what we have 13355 1727096170.43128: results queue empty 13355 1727096170.43129: checking for any_errors_fatal 13355 
1727096170.43137: done checking for any_errors_fatal 13355 1727096170.43138: checking for max_fail_percentage 13355 1727096170.43139: done checking for max_fail_percentage 13355 1727096170.43140: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.43141: done checking to see if all hosts have failed 13355 1727096170.43142: getting the remaining hosts for this loop 13355 1727096170.43143: done getting the remaining hosts for this loop 13355 1727096170.43147: getting the next task for host managed_node3 13355 1727096170.43156: done getting next task for host managed_node3 13355 1727096170.43158: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13355 1727096170.43161: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096170.43174: getting variables 13355 1727096170.43175: in VariableManager get_vars() 13355 1727096170.43230: Calling all_inventory to load vars for managed_node3 13355 1727096170.43233: Calling groups_inventory to load vars for managed_node3 13355 1727096170.43235: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.43247: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.43250: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.43252: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.44278: WORKER PROCESS EXITING 13355 1727096170.47980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.51816: done with get_vars() 13355 1727096170.51854: done getting variables 13355 1727096170.51972: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096170.52099: variable 'profile' from source: include params 13355 1727096170.52107: variable 'item' from source: include params 13355 1727096170.52166: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:56:10 -0400 (0:00:00.133) 0:00:19.782 ****** 13355 1727096170.52200: entering _queue_task() for managed_node3/assert 13355 1727096170.52676: worker is 1 (out of 1 available) 13355 1727096170.52687: exiting _queue_task() for managed_node3/assert 13355 1727096170.52698: done queuing things up, now waiting for results queue to drain 13355 
1727096170.52699: waiting for pending results... 13355 1727096170.52883: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 13355 1727096170.52999: in run() - task 0afff68d-5257-c514-593f-000000000356 13355 1727096170.53021: variable 'ansible_search_path' from source: unknown 13355 1727096170.53033: variable 'ansible_search_path' from source: unknown 13355 1727096170.53074: calling self._execute() 13355 1727096170.53184: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.53200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.53214: variable 'omit' from source: magic vars 13355 1727096170.53594: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.53612: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.53628: variable 'omit' from source: magic vars 13355 1727096170.53735: variable 'omit' from source: magic vars 13355 1727096170.53799: variable 'profile' from source: include params 13355 1727096170.53809: variable 'item' from source: include params 13355 1727096170.53880: variable 'item' from source: include params 13355 1727096170.53909: variable 'omit' from source: magic vars 13355 1727096170.53975: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096170.54031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096170.54058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096170.54084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.54101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.54138: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096170.54170: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.54174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.54276: Set connection var ansible_shell_executable to /bin/sh 13355 1727096170.54280: Set connection var ansible_shell_type to sh 13355 1727096170.54291: Set connection var ansible_pipelining to False 13355 1727096170.54302: Set connection var ansible_connection to ssh 13355 1727096170.54313: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096170.54324: Set connection var ansible_timeout to 10 13355 1727096170.54385: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.54388: variable 'ansible_connection' from source: unknown 13355 1727096170.54391: variable 'ansible_module_compression' from source: unknown 13355 1727096170.54393: variable 'ansible_shell_type' from source: unknown 13355 1727096170.54394: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.54396: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.54398: variable 'ansible_pipelining' from source: unknown 13355 1727096170.54400: variable 'ansible_timeout' from source: unknown 13355 1727096170.54402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.54559: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096170.54615: variable 'omit' from source: magic vars 13355 1727096170.54618: starting attempt loop 13355 1727096170.54620: running the handler 13355 1727096170.54747: variable 'lsr_net_profile_exists' from source: set_fact 13355 
1727096170.54821: Evaluated conditional (lsr_net_profile_exists): True 13355 1727096170.54824: handler run complete 13355 1727096170.54826: attempt loop complete, returning result 13355 1727096170.54828: _execute() done 13355 1727096170.54830: dumping result to json 13355 1727096170.54833: done dumping result, returning 13355 1727096170.54835: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [0afff68d-5257-c514-593f-000000000356] 13355 1727096170.54837: sending task result for task 0afff68d-5257-c514-593f-000000000356 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096170.54978: no more pending results, returning what we have 13355 1727096170.54982: results queue empty 13355 1727096170.54983: checking for any_errors_fatal 13355 1727096170.54989: done checking for any_errors_fatal 13355 1727096170.54990: checking for max_fail_percentage 13355 1727096170.54992: done checking for max_fail_percentage 13355 1727096170.54993: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.54993: done checking to see if all hosts have failed 13355 1727096170.54994: getting the remaining hosts for this loop 13355 1727096170.54996: done getting the remaining hosts for this loop 13355 1727096170.54999: getting the next task for host managed_node3 13355 1727096170.55007: done getting next task for host managed_node3 13355 1727096170.55009: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13355 1727096170.55012: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096170.55018: getting variables 13355 1727096170.55020: in VariableManager get_vars() 13355 1727096170.55081: Calling all_inventory to load vars for managed_node3 13355 1727096170.55084: Calling groups_inventory to load vars for managed_node3 13355 1727096170.55087: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.55099: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.55102: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.55105: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.56304: done sending task result for task 0afff68d-5257-c514-593f-000000000356 13355 1727096170.56308: WORKER PROCESS EXITING 13355 1727096170.57308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.59697: done with get_vars() 13355 1727096170.59776: done getting variables 13355 1727096170.59956: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096170.60122: variable 'profile' from source: include params 13355 1727096170.60126: variable 'item' from source: include params 13355 1727096170.60227: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:56:10 -0400 (0:00:00.081) 0:00:19.864 ****** 13355 1727096170.60406: entering _queue_task() for 
managed_node3/assert 13355 1727096170.61141: worker is 1 (out of 1 available) 13355 1727096170.61154: exiting _queue_task() for managed_node3/assert 13355 1727096170.61165: done queuing things up, now waiting for results queue to drain 13355 1727096170.61166: waiting for pending results... 13355 1727096170.61461: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 13355 1727096170.61656: in run() - task 0afff68d-5257-c514-593f-000000000357 13355 1727096170.61708: variable 'ansible_search_path' from source: unknown 13355 1727096170.61712: variable 'ansible_search_path' from source: unknown 13355 1727096170.61741: calling self._execute() 13355 1727096170.61846: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.61872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.61881: variable 'omit' from source: magic vars 13355 1727096170.62366: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.62373: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.62376: variable 'omit' from source: magic vars 13355 1727096170.62407: variable 'omit' from source: magic vars 13355 1727096170.62541: variable 'profile' from source: include params 13355 1727096170.62630: variable 'item' from source: include params 13355 1727096170.62634: variable 'item' from source: include params 13355 1727096170.62674: variable 'omit' from source: magic vars 13355 1727096170.62740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096170.62788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096170.62812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096170.62834: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.62859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.62894: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096170.62902: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.62956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.63041: Set connection var ansible_shell_executable to /bin/sh 13355 1727096170.63054: Set connection var ansible_shell_type to sh 13355 1727096170.63261: Set connection var ansible_pipelining to False 13355 1727096170.63283: Set connection var ansible_connection to ssh 13355 1727096170.63286: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096170.63289: Set connection var ansible_timeout to 10 13355 1727096170.63291: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.63293: variable 'ansible_connection' from source: unknown 13355 1727096170.63295: variable 'ansible_module_compression' from source: unknown 13355 1727096170.63297: variable 'ansible_shell_type' from source: unknown 13355 1727096170.63299: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.63301: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.63302: variable 'ansible_pipelining' from source: unknown 13355 1727096170.63305: variable 'ansible_timeout' from source: unknown 13355 1727096170.63307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.63609: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096170.63628: variable 'omit' from source: magic vars 13355 1727096170.63875: starting attempt loop 13355 1727096170.63878: running the handler 13355 1727096170.63881: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13355 1727096170.63883: Evaluated conditional (lsr_net_profile_ansible_managed): True 13355 1727096170.63885: handler run complete 13355 1727096170.63887: attempt loop complete, returning result 13355 1727096170.63889: _execute() done 13355 1727096170.63890: dumping result to json 13355 1727096170.63988: done dumping result, returning 13355 1727096170.64001: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [0afff68d-5257-c514-593f-000000000357] 13355 1727096170.64011: sending task result for task 0afff68d-5257-c514-593f-000000000357 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096170.64147: no more pending results, returning what we have 13355 1727096170.64151: results queue empty 13355 1727096170.64152: checking for any_errors_fatal 13355 1727096170.64158: done checking for any_errors_fatal 13355 1727096170.64159: checking for max_fail_percentage 13355 1727096170.64160: done checking for max_fail_percentage 13355 1727096170.64161: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.64162: done checking to see if all hosts have failed 13355 1727096170.64163: getting the remaining hosts for this loop 13355 1727096170.64164: done getting the remaining hosts for this loop 13355 1727096170.64169: getting the next task for host managed_node3 13355 1727096170.64176: done getting next task for host managed_node3 13355 1727096170.64179: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13355 
1727096170.64182: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096170.64188: getting variables 13355 1727096170.64189: in VariableManager get_vars() 13355 1727096170.64245: Calling all_inventory to load vars for managed_node3 13355 1727096170.64248: Calling groups_inventory to load vars for managed_node3 13355 1727096170.64250: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.64262: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.64266: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.64577: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.65423: done sending task result for task 0afff68d-5257-c514-593f-000000000357 13355 1727096170.65426: WORKER PROCESS EXITING 13355 1727096170.66413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.68056: done with get_vars() 13355 1727096170.68084: done getting variables 13355 1727096170.68148: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096170.68271: variable 'profile' from source: include params 13355 1727096170.68275: variable 'item' from 
source: include params 13355 1727096170.68336: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:56:10 -0400 (0:00:00.079) 0:00:19.944 ****** 13355 1727096170.68402: entering _queue_task() for managed_node3/assert 13355 1727096170.68799: worker is 1 (out of 1 available) 13355 1727096170.68813: exiting _queue_task() for managed_node3/assert 13355 1727096170.68827: done queuing things up, now waiting for results queue to drain 13355 1727096170.68828: waiting for pending results... 13355 1727096170.69075: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 13355 1727096170.69173: in run() - task 0afff68d-5257-c514-593f-000000000358 13355 1727096170.69187: variable 'ansible_search_path' from source: unknown 13355 1727096170.69190: variable 'ansible_search_path' from source: unknown 13355 1727096170.69234: calling self._execute() 13355 1727096170.69337: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.69343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.69353: variable 'omit' from source: magic vars 13355 1727096170.69755: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.69778: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.69782: variable 'omit' from source: magic vars 13355 1727096170.69817: variable 'omit' from source: magic vars 13355 1727096170.69929: variable 'profile' from source: include params 13355 1727096170.69933: variable 'item' from source: include params 13355 1727096170.70004: variable 'item' from source: include params 13355 1727096170.70024: variable 'omit' from source: magic vars 13355 1727096170.70070: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096170.70108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096170.70128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096170.70175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.70178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.70200: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096170.70204: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.70206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.70321: Set connection var ansible_shell_executable to /bin/sh 13355 1727096170.70391: Set connection var ansible_shell_type to sh 13355 1727096170.70394: Set connection var ansible_pipelining to False 13355 1727096170.70397: Set connection var ansible_connection to ssh 13355 1727096170.70400: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096170.70403: Set connection var ansible_timeout to 10 13355 1727096170.70405: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.70407: variable 'ansible_connection' from source: unknown 13355 1727096170.70410: variable 'ansible_module_compression' from source: unknown 13355 1727096170.70412: variable 'ansible_shell_type' from source: unknown 13355 1727096170.70414: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.70416: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.70419: variable 'ansible_pipelining' from source: unknown 13355 1727096170.70421: variable 'ansible_timeout' from 
source: unknown 13355 1727096170.70423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.70559: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096170.70574: variable 'omit' from source: magic vars 13355 1727096170.70580: starting attempt loop 13355 1727096170.70583: running the handler 13355 1727096170.70718: variable 'lsr_net_profile_fingerprint' from source: set_fact 13355 1727096170.70722: Evaluated conditional (lsr_net_profile_fingerprint): True 13355 1727096170.70724: handler run complete 13355 1727096170.70735: attempt loop complete, returning result 13355 1727096170.70738: _execute() done 13355 1727096170.70741: dumping result to json 13355 1727096170.70743: done dumping result, returning 13355 1727096170.70772: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [0afff68d-5257-c514-593f-000000000358] 13355 1727096170.70776: sending task result for task 0afff68d-5257-c514-593f-000000000358 13355 1727096170.70976: done sending task result for task 0afff68d-5257-c514-593f-000000000358 13355 1727096170.70979: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096170.71023: no more pending results, returning what we have 13355 1727096170.71026: results queue empty 13355 1727096170.71027: checking for any_errors_fatal 13355 1727096170.71032: done checking for any_errors_fatal 13355 1727096170.71033: checking for max_fail_percentage 13355 1727096170.71035: done checking for max_fail_percentage 13355 1727096170.71035: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.71036: done checking to see if all hosts have 
failed 13355 1727096170.71037: getting the remaining hosts for this loop 13355 1727096170.71038: done getting the remaining hosts for this loop 13355 1727096170.71041: getting the next task for host managed_node3 13355 1727096170.71049: done getting next task for host managed_node3 13355 1727096170.71051: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13355 1727096170.71053: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096170.71057: getting variables 13355 1727096170.71058: in VariableManager get_vars() 13355 1727096170.71106: Calling all_inventory to load vars for managed_node3 13355 1727096170.71108: Calling groups_inventory to load vars for managed_node3 13355 1727096170.71110: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.71119: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.71121: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.71124: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.72350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.74073: done with get_vars() 13355 1727096170.74113: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Monday 23 September 2024 08:56:10 -0400 (0:00:00.058) 0:00:20.003 ****** 13355 1727096170.74241: entering _queue_task() for managed_node3/include_tasks 13355 1727096170.74625: worker is 1 (out of 1 available) 13355 1727096170.74640: exiting _queue_task() for managed_node3/include_tasks 13355 1727096170.74653: done queuing things up, now waiting for results queue to drain 13355 1727096170.74654: waiting for pending results... 13355 1727096170.74896: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 13355 1727096170.75002: in run() - task 0afff68d-5257-c514-593f-00000000035c 13355 1727096170.75006: variable 'ansible_search_path' from source: unknown 13355 1727096170.75008: variable 'ansible_search_path' from source: unknown 13355 1727096170.75074: calling self._execute() 13355 1727096170.75144: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.75148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.75162: variable 'omit' from source: magic vars 13355 1727096170.75560: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.75650: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.75654: _execute() done 13355 1727096170.75656: dumping result to json 13355 1727096170.75657: done dumping result, returning 13355 1727096170.75659: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-c514-593f-00000000035c] 13355 1727096170.75661: sending task result for task 0afff68d-5257-c514-593f-00000000035c 13355 1727096170.75722: done sending task result for task 0afff68d-5257-c514-593f-00000000035c 13355 1727096170.75725: WORKER PROCESS EXITING 13355 1727096170.75782: no more pending results, returning what we have 13355 1727096170.75788: in VariableManager get_vars() 13355 1727096170.75846: Calling all_inventory to load vars for managed_node3 13355 
1727096170.75848: Calling groups_inventory to load vars for managed_node3 13355 1727096170.75850: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.75860: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.75864: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.75866: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.77411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.78937: done with get_vars() 13355 1727096170.78963: variable 'ansible_search_path' from source: unknown 13355 1727096170.78964: variable 'ansible_search_path' from source: unknown 13355 1727096170.79005: we have included files to process 13355 1727096170.79007: generating all_blocks data 13355 1727096170.79008: done generating all_blocks data 13355 1727096170.79014: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096170.79015: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096170.79017: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096170.79896: done processing included file 13355 1727096170.79898: iterating over new_blocks loaded from include file 13355 1727096170.79901: in VariableManager get_vars() 13355 1727096170.79933: done with get_vars() 13355 1727096170.79935: filtering new block on tags 13355 1727096170.79958: done filtering new block on tags 13355 1727096170.79962: in VariableManager get_vars() 13355 1727096170.79993: done with get_vars() 13355 1727096170.79995: filtering new block on tags 13355 1727096170.80016: done filtering new block on tags 13355 1727096170.80018: done iterating over new_blocks 
loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 13355 1727096170.80023: extending task lists for all hosts with included blocks 13355 1727096170.80189: done extending task lists 13355 1727096170.80190: done processing included files 13355 1727096170.80191: results queue empty 13355 1727096170.80192: checking for any_errors_fatal 13355 1727096170.80195: done checking for any_errors_fatal 13355 1727096170.80196: checking for max_fail_percentage 13355 1727096170.80197: done checking for max_fail_percentage 13355 1727096170.80198: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.80199: done checking to see if all hosts have failed 13355 1727096170.80199: getting the remaining hosts for this loop 13355 1727096170.80200: done getting the remaining hosts for this loop 13355 1727096170.80203: getting the next task for host managed_node3 13355 1727096170.80207: done getting next task for host managed_node3 13355 1727096170.80209: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13355 1727096170.80212: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096170.80214: getting variables 13355 1727096170.80215: in VariableManager get_vars() 13355 1727096170.80234: Calling all_inventory to load vars for managed_node3 13355 1727096170.80237: Calling groups_inventory to load vars for managed_node3 13355 1727096170.80239: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.80244: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.80247: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.80250: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.81412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.82954: done with get_vars() 13355 1727096170.82987: done getting variables 13355 1727096170.83035: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:56:10 -0400 (0:00:00.088) 0:00:20.091 ****** 13355 1727096170.83070: entering _queue_task() for managed_node3/set_fact 13355 1727096170.83426: worker is 1 (out of 1 available) 13355 1727096170.83441: exiting _queue_task() for managed_node3/set_fact 13355 1727096170.83455: done queuing things up, now waiting for results queue to drain 13355 1727096170.83456: waiting for pending results... 
13355 1727096170.83681: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13355 1727096170.83874: in run() - task 0afff68d-5257-c514-593f-00000000062c 13355 1727096170.83880: variable 'ansible_search_path' from source: unknown 13355 1727096170.83883: variable 'ansible_search_path' from source: unknown 13355 1727096170.83886: calling self._execute() 13355 1727096170.83937: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.84001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.84005: variable 'omit' from source: magic vars 13355 1727096170.84346: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.84365: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.84375: variable 'omit' from source: magic vars 13355 1727096170.84422: variable 'omit' from source: magic vars 13355 1727096170.84471: variable 'omit' from source: magic vars 13355 1727096170.84516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096170.84546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096170.84579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096170.84597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.84608: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.84639: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096170.84642: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.84645: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13355 1727096170.84764: Set connection var ansible_shell_executable to /bin/sh 13355 1727096170.84769: Set connection var ansible_shell_type to sh 13355 1727096170.84778: Set connection var ansible_pipelining to False 13355 1727096170.84783: Set connection var ansible_connection to ssh 13355 1727096170.84794: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096170.84799: Set connection var ansible_timeout to 10 13355 1727096170.84825: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.84828: variable 'ansible_connection' from source: unknown 13355 1727096170.84831: variable 'ansible_module_compression' from source: unknown 13355 1727096170.84833: variable 'ansible_shell_type' from source: unknown 13355 1727096170.84836: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.84838: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.84840: variable 'ansible_pipelining' from source: unknown 13355 1727096170.84842: variable 'ansible_timeout' from source: unknown 13355 1727096170.84874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.85011: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096170.85023: variable 'omit' from source: magic vars 13355 1727096170.85091: starting attempt loop 13355 1727096170.85094: running the handler 13355 1727096170.85096: handler run complete 13355 1727096170.85099: attempt loop complete, returning result 13355 1727096170.85101: _execute() done 13355 1727096170.85103: dumping result to json 13355 1727096170.85105: done dumping result, returning 13355 1727096170.85107: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-c514-593f-00000000062c] 13355 1727096170.85109: sending task result for task 0afff68d-5257-c514-593f-00000000062c 13355 1727096170.85171: done sending task result for task 0afff68d-5257-c514-593f-00000000062c 13355 1727096170.85173: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13355 1727096170.85245: no more pending results, returning what we have 13355 1727096170.85249: results queue empty 13355 1727096170.85249: checking for any_errors_fatal 13355 1727096170.85251: done checking for any_errors_fatal 13355 1727096170.85252: checking for max_fail_percentage 13355 1727096170.85254: done checking for max_fail_percentage 13355 1727096170.85255: checking to see if all hosts have failed and the running result is not ok 13355 1727096170.85255: done checking to see if all hosts have failed 13355 1727096170.85256: getting the remaining hosts for this loop 13355 1727096170.85257: done getting the remaining hosts for this loop 13355 1727096170.85260: getting the next task for host managed_node3 13355 1727096170.85269: done getting next task for host managed_node3 13355 1727096170.85272: ^ task is: TASK: Stat profile file 13355 1727096170.85276: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096170.85280: getting variables 13355 1727096170.85283: in VariableManager get_vars() 13355 1727096170.85335: Calling all_inventory to load vars for managed_node3 13355 1727096170.85338: Calling groups_inventory to load vars for managed_node3 13355 1727096170.85340: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096170.85350: Calling all_plugins_play to load vars for managed_node3 13355 1727096170.85353: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096170.85356: Calling groups_plugins_play to load vars for managed_node3 13355 1727096170.91489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096170.92972: done with get_vars() 13355 1727096170.93000: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:56:10 -0400 (0:00:00.100) 0:00:20.191 ****** 13355 1727096170.93088: entering _queue_task() for managed_node3/stat 13355 1727096170.93695: worker is 1 (out of 1 available) 13355 1727096170.93703: exiting _queue_task() for managed_node3/stat 13355 1727096170.93713: done queuing things up, now waiting for results queue to drain 13355 1727096170.93715: waiting for pending results... 
13355 1727096170.93941: running TaskExecutor() for managed_node3/TASK: Stat profile file 13355 1727096170.93965: in run() - task 0afff68d-5257-c514-593f-00000000062d 13355 1727096170.93994: variable 'ansible_search_path' from source: unknown 13355 1727096170.94036: variable 'ansible_search_path' from source: unknown 13355 1727096170.94063: calling self._execute() 13355 1727096170.94176: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.94189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.94275: variable 'omit' from source: magic vars 13355 1727096170.94654: variable 'ansible_distribution_major_version' from source: facts 13355 1727096170.94675: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096170.94699: variable 'omit' from source: magic vars 13355 1727096170.94759: variable 'omit' from source: magic vars 13355 1727096170.94882: variable 'profile' from source: include params 13355 1727096170.94926: variable 'item' from source: include params 13355 1727096170.94980: variable 'item' from source: include params 13355 1727096170.95006: variable 'omit' from source: magic vars 13355 1727096170.95065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096170.95144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096170.95148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096170.95154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.95173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096170.95208: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 
1727096170.95223: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.95231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.95373: Set connection var ansible_shell_executable to /bin/sh 13355 1727096170.95376: Set connection var ansible_shell_type to sh 13355 1727096170.95379: Set connection var ansible_pipelining to False 13355 1727096170.95396: Set connection var ansible_connection to ssh 13355 1727096170.95408: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096170.95436: Set connection var ansible_timeout to 10 13355 1727096170.95480: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.95489: variable 'ansible_connection' from source: unknown 13355 1727096170.95497: variable 'ansible_module_compression' from source: unknown 13355 1727096170.95504: variable 'ansible_shell_type' from source: unknown 13355 1727096170.95511: variable 'ansible_shell_executable' from source: unknown 13355 1727096170.95530: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096170.95533: variable 'ansible_pipelining' from source: unknown 13355 1727096170.95574: variable 'ansible_timeout' from source: unknown 13355 1727096170.95577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096170.95754: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096170.95775: variable 'omit' from source: magic vars 13355 1727096170.95792: starting attempt loop 13355 1727096170.95800: running the handler 13355 1727096170.95873: _low_level_execute_command(): starting 13355 1727096170.95876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096170.96758: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096170.96831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096170.96870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096170.98560: stdout chunk (state=3): >>>/root <<< 13355 1727096170.98717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096170.98721: stdout chunk (state=3): >>><<< 13355 1727096170.98723: stderr chunk (state=3): >>><<< 13355 1727096170.98846: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096170.98850: _low_level_execute_command(): starting 13355 1727096170.98853: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904 `" && echo ansible-tmp-1727096170.9875073-14295-101396095863904="` echo /root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904 `" ) && sleep 0' 13355 1727096170.99406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096170.99419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096170.99435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096170.99453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096170.99529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096170.99533: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096170.99614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096170.99679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.01670: stdout chunk (state=3): >>>ansible-tmp-1727096170.9875073-14295-101396095863904=/root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904 <<< 13355 1727096171.01820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.01846: stdout chunk (state=3): >>><<< 13355 1727096171.01850: stderr chunk (state=3): >>><<< 13355 1727096171.02079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096170.9875073-14295-101396095863904=/root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096171.02083: variable 'ansible_module_compression' from source: unknown 13355 1727096171.02086: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13355 1727096171.02088: variable 'ansible_facts' from source: unknown 13355 1727096171.02126: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/AnsiballZ_stat.py 13355 1727096171.02296: Sending initial data 13355 1727096171.02300: Sent initial data (153 bytes) 13355 1727096171.02895: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096171.02983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.03000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.03011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096171.03019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.03083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.04711: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096171.04755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096171.04785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/AnsiballZ_stat.py" <<< 13355 1727096171.04806: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpqmgpnxx3 /root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/AnsiballZ_stat.py <<< 13355 1727096171.04862: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpqmgpnxx3" to remote "/root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/AnsiballZ_stat.py" <<< 13355 1727096171.05681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.05743: stderr chunk (state=3): >>><<< 13355 1727096171.05747: stdout chunk (state=3): >>><<< 13355 1727096171.05770: done transferring module to remote 13355 1727096171.05838: _low_level_execute_command(): starting 13355 1727096171.05842: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/ /root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/AnsiballZ_stat.py && sleep 0' 13355 1727096171.06611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.06645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.06663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096171.06700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.06772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.08641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.08680: stdout chunk (state=3): >>><<< 13355 1727096171.08683: stderr chunk (state=3): >>><<< 13355 1727096171.08785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096171.08796: _low_level_execute_command(): starting 13355 1727096171.08799: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/AnsiballZ_stat.py && sleep 0' 13355 1727096171.09377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096171.09394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096171.09420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096171.09440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096171.09458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096171.09473: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096171.09488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.09533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.09620: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.09648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096171.09688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.09773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.25510: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13355 1727096171.27072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096171.27076: stdout chunk (state=3): >>><<< 13355 1727096171.27079: stderr chunk (state=3): >>><<< 13355 1727096171.27081: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096171.27084: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096171.27087: _low_level_execute_command(): starting 13355 1727096171.27088: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096170.9875073-14295-101396095863904/ > /dev/null 2>&1 && sleep 0' 13355 1727096171.27686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.27696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.27709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096171.27718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.27813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.29922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.29992: stderr chunk (state=3): >>><<< 13355 1727096171.29995: stdout chunk (state=3): >>><<< 13355 1727096171.30040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096171.30048: handler run complete 13355 1727096171.30273: attempt loop complete, returning result 13355 1727096171.30277: _execute() done 13355 1727096171.30279: dumping result to json 13355 1727096171.30282: done dumping result, returning 13355 1727096171.30284: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-c514-593f-00000000062d] 13355 1727096171.30286: sending task result for task 0afff68d-5257-c514-593f-00000000062d 13355 1727096171.30360: done sending task result for task 0afff68d-5257-c514-593f-00000000062d 13355 1727096171.30364: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 13355 1727096171.30432: no more pending results, returning what we have 13355 1727096171.30436: results queue empty 13355 1727096171.30437: checking for any_errors_fatal 13355 1727096171.30445: done checking for any_errors_fatal 13355 1727096171.30446: checking for max_fail_percentage 13355 1727096171.30448: done checking for max_fail_percentage 13355 1727096171.30449: checking to see if all hosts have failed and the running result is not ok 13355 1727096171.30449: done checking to see if all hosts have failed 13355 1727096171.30450: getting the remaining hosts for this loop 13355 1727096171.30452: done getting the remaining hosts for this loop 13355 1727096171.30458: getting the next task for host managed_node3 13355 1727096171.30465: done getting next task for host managed_node3 13355 1727096171.30470: ^ task is: TASK: 
Set NM profile exist flag based on the profile files 13355 1727096171.30474: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096171.30483: getting variables 13355 1727096171.30485: in VariableManager get_vars() 13355 1727096171.30542: Calling all_inventory to load vars for managed_node3 13355 1727096171.30545: Calling groups_inventory to load vars for managed_node3 13355 1727096171.30548: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096171.30562: Calling all_plugins_play to load vars for managed_node3 13355 1727096171.30566: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096171.30677: Calling groups_plugins_play to load vars for managed_node3 13355 1727096171.32276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096171.34151: done with get_vars() 13355 1727096171.34186: done getting variables 13355 1727096171.34250: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:56:11 -0400 (0:00:00.412) 0:00:20.603 ****** 13355 1727096171.34296: entering _queue_task() for managed_node3/set_fact 13355 1727096171.34692: worker is 1 (out of 1 available) 13355 1727096171.34705: exiting _queue_task() for managed_node3/set_fact 13355 1727096171.34721: done queuing things up, now waiting for results queue to drain 13355 1727096171.34723: waiting for pending results... 13355 1727096171.35100: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 13355 1727096171.35173: in run() - task 0afff68d-5257-c514-593f-00000000062e 13355 1727096171.35178: variable 'ansible_search_path' from source: unknown 13355 1727096171.35180: variable 'ansible_search_path' from source: unknown 13355 1727096171.35233: calling self._execute() 13355 1727096171.35373: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.35376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.35380: variable 'omit' from source: magic vars 13355 1727096171.35811: variable 'ansible_distribution_major_version' from source: facts 13355 1727096171.35830: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096171.36073: variable 'profile_stat' from source: set_fact 13355 1727096171.36077: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096171.36080: when evaluation is False, skipping this task 13355 1727096171.36083: _execute() done 13355 1727096171.36085: dumping result to json 13355 1727096171.36088: done dumping 
result, returning 13355 1727096171.36091: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-c514-593f-00000000062e] 13355 1727096171.36093: sending task result for task 0afff68d-5257-c514-593f-00000000062e 13355 1727096171.36165: done sending task result for task 0afff68d-5257-c514-593f-00000000062e skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096171.36220: no more pending results, returning what we have 13355 1727096171.36224: results queue empty 13355 1727096171.36225: checking for any_errors_fatal 13355 1727096171.36235: done checking for any_errors_fatal 13355 1727096171.36237: checking for max_fail_percentage 13355 1727096171.36239: done checking for max_fail_percentage 13355 1727096171.36240: checking to see if all hosts have failed and the running result is not ok 13355 1727096171.36241: done checking to see if all hosts have failed 13355 1727096171.36241: getting the remaining hosts for this loop 13355 1727096171.36243: done getting the remaining hosts for this loop 13355 1727096171.36247: getting the next task for host managed_node3 13355 1727096171.36257: done getting next task for host managed_node3 13355 1727096171.36260: ^ task is: TASK: Get NM profile info 13355 1727096171.36264: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096171.36271: getting variables 13355 1727096171.36273: in VariableManager get_vars() 13355 1727096171.36333: Calling all_inventory to load vars for managed_node3 13355 1727096171.36336: Calling groups_inventory to load vars for managed_node3 13355 1727096171.36339: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096171.36357: Calling all_plugins_play to load vars for managed_node3 13355 1727096171.36362: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096171.36366: Calling groups_plugins_play to load vars for managed_node3 13355 1727096171.36489: WORKER PROCESS EXITING 13355 1727096171.38151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096171.39838: done with get_vars() 13355 1727096171.39879: done getting variables 13355 1727096171.39942: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:56:11 -0400 (0:00:00.056) 0:00:20.660 ****** 13355 1727096171.39988: entering _queue_task() for managed_node3/shell 13355 1727096171.40516: worker is 1 (out of 1 available) 13355 1727096171.40527: exiting _queue_task() for managed_node3/shell 13355 1727096171.40538: done queuing things up, now waiting for results 
queue to drain 13355 1727096171.40539: waiting for pending results... 13355 1727096171.40707: running TaskExecutor() for managed_node3/TASK: Get NM profile info 13355 1727096171.40849: in run() - task 0afff68d-5257-c514-593f-00000000062f 13355 1727096171.40880: variable 'ansible_search_path' from source: unknown 13355 1727096171.40888: variable 'ansible_search_path' from source: unknown 13355 1727096171.40931: calling self._execute() 13355 1727096171.41049: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.41066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.41082: variable 'omit' from source: magic vars 13355 1727096171.41493: variable 'ansible_distribution_major_version' from source: facts 13355 1727096171.41511: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096171.41528: variable 'omit' from source: magic vars 13355 1727096171.41591: variable 'omit' from source: magic vars 13355 1727096171.41718: variable 'profile' from source: include params 13355 1727096171.41721: variable 'item' from source: include params 13355 1727096171.41769: variable 'item' from source: include params 13355 1727096171.41784: variable 'omit' from source: magic vars 13355 1727096171.41823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096171.41854: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096171.41874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096171.41888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096171.41898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096171.41923: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096171.41928: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.41930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.42003: Set connection var ansible_shell_executable to /bin/sh 13355 1727096171.42007: Set connection var ansible_shell_type to sh 13355 1727096171.42013: Set connection var ansible_pipelining to False 13355 1727096171.42017: Set connection var ansible_connection to ssh 13355 1727096171.42026: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096171.42028: Set connection var ansible_timeout to 10 13355 1727096171.42048: variable 'ansible_shell_executable' from source: unknown 13355 1727096171.42051: variable 'ansible_connection' from source: unknown 13355 1727096171.42054: variable 'ansible_module_compression' from source: unknown 13355 1727096171.42058: variable 'ansible_shell_type' from source: unknown 13355 1727096171.42061: variable 'ansible_shell_executable' from source: unknown 13355 1727096171.42064: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.42067: variable 'ansible_pipelining' from source: unknown 13355 1727096171.42071: variable 'ansible_timeout' from source: unknown 13355 1727096171.42076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.42178: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096171.42187: variable 'omit' from source: magic vars 13355 1727096171.42193: starting attempt loop 13355 1727096171.42196: running the handler 13355 1727096171.42205: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096171.42222: _low_level_execute_command(): starting 13355 1727096171.42229: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096171.42751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096171.42758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096171.42762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.42809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.42812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096171.42819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.42865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.44545: stdout chunk (state=3): >>>/root 
<<< 13355 1727096171.44640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.44670: stderr chunk (state=3): >>><<< 13355 1727096171.44674: stdout chunk (state=3): >>><<< 13355 1727096171.44698: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096171.44710: _low_level_execute_command(): starting 13355 1727096171.44716: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589 `" && echo ansible-tmp-1727096171.446976-14309-61073287570589="` echo /root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589 `" ) && sleep 0' 13355 1727096171.45145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096171.45160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096171.45187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096171.45192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.45236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.45239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.45292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.47220: stdout chunk (state=3): >>>ansible-tmp-1727096171.446976-14309-61073287570589=/root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589 <<< 13355 1727096171.47327: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.47352: stderr chunk (state=3): >>><<< 13355 1727096171.47358: stdout chunk (state=3): >>><<< 13355 1727096171.47372: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096171.446976-14309-61073287570589=/root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096171.47406: variable 'ansible_module_compression' from source: unknown 13355 1727096171.47446: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096171.47480: variable 'ansible_facts' from source: unknown 13355 1727096171.47537: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/AnsiballZ_command.py 13355 1727096171.47641: Sending initial data 13355 1727096171.47644: Sent initial data (154 bytes) 13355 1727096171.48055: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096171.48073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096171.48077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.48093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.48148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.48151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096171.48154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.48195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.49810: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096171.49836: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096171.49878: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmprvfmbfvt /root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/AnsiballZ_command.py <<< 13355 1727096171.49882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/AnsiballZ_command.py" <<< 13355 1727096171.49914: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmprvfmbfvt" to remote "/root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/AnsiballZ_command.py" <<< 13355 1727096171.50676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.50680: stdout chunk (state=3): >>><<< 13355 1727096171.50684: stderr chunk (state=3): >>><<< 13355 1727096171.50692: done transferring module to remote 13355 1727096171.50705: _low_level_execute_command(): starting 13355 1727096171.50710: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/ /root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/AnsiballZ_command.py && sleep 0' 13355 1727096171.51285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.51328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.51332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.51376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.53522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.53526: stdout chunk (state=3): >>><<< 13355 1727096171.53528: stderr chunk (state=3): >>><<< 13355 1727096171.53629: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096171.53633: _low_level_execute_command(): starting 13355 1727096171.53637: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/AnsiballZ_command.py && sleep 0' 13355 1727096171.54411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096171.54420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096171.54525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.54532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096171.54556: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.54618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.72446: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-23 08:56:11.700629", "end": "2024-09-23 08:56:11.721514", "delta": "0:00:00.020885", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096171.74110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096171.74136: stderr chunk (state=3): >>><<< 13355 1727096171.74139: stdout chunk (state=3): >>><<< 13355 1727096171.74161: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-23 08:56:11.700629", "end": "2024-09-23 08:56:11.721514", "delta": "0:00:00.020885", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096171.74194: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096171.74200: _low_level_execute_command(): starting 13355 1727096171.74205: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096171.446976-14309-61073287570589/ > /dev/null 2>&1 && sleep 0' 13355 1727096171.74638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096171.74673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096171.74676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096171.74679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.74687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096171.74689: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096171.74739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096171.74747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096171.74749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096171.74783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096171.76632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096171.76662: stderr chunk (state=3): >>><<< 13355 1727096171.76666: stdout chunk (state=3): >>><<< 13355 1727096171.76681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096171.76688: handler run complete 13355 1727096171.76705: Evaluated conditional (False): False 13355 1727096171.76714: attempt loop complete, returning result 13355 1727096171.76716: _execute() done 13355 1727096171.76719: dumping result to json 13355 1727096171.76724: done dumping result, returning 13355 1727096171.76731: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-c514-593f-00000000062f] 13355 1727096171.76741: sending task result for task 0afff68d-5257-c514-593f-00000000062f 13355 1727096171.76834: done sending task result for task 0afff68d-5257-c514-593f-00000000062f 13355 1727096171.76837: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020885", "end": "2024-09-23 08:56:11.721514", "rc": 0, "start": "2024-09-23 08:56:11.700629" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 13355 1727096171.76910: no more pending results, returning what we have 13355 1727096171.76913: results queue empty 13355 1727096171.76914: checking for any_errors_fatal 13355 1727096171.76921: done checking for any_errors_fatal 13355 1727096171.76922: checking for max_fail_percentage 13355 1727096171.76923: done checking for max_fail_percentage 13355 1727096171.76924: checking to see if all hosts have failed and the running result is not ok 13355 1727096171.76925: done checking to see if all hosts have failed 13355 1727096171.76925: getting the remaining hosts for this loop 13355 1727096171.76926: done getting the remaining hosts for this loop 13355 1727096171.76930: getting the next task for host managed_node3 13355 1727096171.76937: done 
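The "Get NM profile info" result above shows a shell pipeline run through `ansible.legacy.command` with `_uses_shell: True`, and the later conditional `nm_profile_exists.rc == 0` shows the result was registered. A minimal sketch of what such a task might look like, assuming the task name and register variable from the log (the real task lives in `tests/network/playbooks/tasks/get_profile_stat.yml`; the exact YAML is a reconstruction, not the collection's source):

```yaml
# Hedged reconstruction of the task whose result appears above.
# "bond0.0" comes in via the include parameter "profile"; ignore_errors
# is an assumption so a missing profile does not fail the play.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  ignore_errors: true
```

The `grep /etc` filter matters here: it matches only profiles persisted on disk (e.g. under `/etc/NetworkManager/system-connections/`), excluding in-memory connections.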
getting next task for host managed_node3 13355 1727096171.76940: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13355 1727096171.76944: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096171.76950: getting variables 13355 1727096171.76951: in VariableManager get_vars() 13355 1727096171.77006: Calling all_inventory to load vars for managed_node3 13355 1727096171.77009: Calling groups_inventory to load vars for managed_node3 13355 1727096171.77011: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096171.77021: Calling all_plugins_play to load vars for managed_node3 13355 1727096171.77023: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096171.77026: Calling groups_plugins_play to load vars for managed_node3 13355 1727096171.77965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096171.78823: done with get_vars() 13355 1727096171.78841: done getting variables 13355 1727096171.78890: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:56:11 -0400 (0:00:00.389) 0:00:21.049 ****** 13355 1727096171.78916: entering _queue_task() for managed_node3/set_fact 13355 1727096171.79187: worker is 1 (out of 1 available) 13355 1727096171.79199: exiting _queue_task() for managed_node3/set_fact 13355 1727096171.79212: done queuing things up, now waiting for results queue to drain 13355 1727096171.79214: waiting for pending results... 
13355 1727096171.79394: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13355 1727096171.79476: in run() - task 0afff68d-5257-c514-593f-000000000630 13355 1727096171.79488: variable 'ansible_search_path' from source: unknown 13355 1727096171.79491: variable 'ansible_search_path' from source: unknown 13355 1727096171.79523: calling self._execute() 13355 1727096171.79600: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.79605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.79615: variable 'omit' from source: magic vars 13355 1727096171.79895: variable 'ansible_distribution_major_version' from source: facts 13355 1727096171.79906: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096171.79998: variable 'nm_profile_exists' from source: set_fact 13355 1727096171.80011: Evaluated conditional (nm_profile_exists.rc == 0): True 13355 1727096171.80016: variable 'omit' from source: magic vars 13355 1727096171.80054: variable 'omit' from source: magic vars 13355 1727096171.80077: variable 'omit' from source: magic vars 13355 1727096171.80109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096171.80136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096171.80156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096171.80171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096171.80179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096171.80204: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
13355 1727096171.80207: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.80210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.80280: Set connection var ansible_shell_executable to /bin/sh 13355 1727096171.80283: Set connection var ansible_shell_type to sh 13355 1727096171.80289: Set connection var ansible_pipelining to False 13355 1727096171.80294: Set connection var ansible_connection to ssh 13355 1727096171.80299: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096171.80305: Set connection var ansible_timeout to 10 13355 1727096171.80322: variable 'ansible_shell_executable' from source: unknown 13355 1727096171.80325: variable 'ansible_connection' from source: unknown 13355 1727096171.80327: variable 'ansible_module_compression' from source: unknown 13355 1727096171.80329: variable 'ansible_shell_type' from source: unknown 13355 1727096171.80331: variable 'ansible_shell_executable' from source: unknown 13355 1727096171.80334: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.80338: variable 'ansible_pipelining' from source: unknown 13355 1727096171.80340: variable 'ansible_timeout' from source: unknown 13355 1727096171.80343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.80445: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096171.80456: variable 'omit' from source: magic vars 13355 1727096171.80459: starting attempt loop 13355 1727096171.80462: running the handler 13355 1727096171.80474: handler run complete 13355 1727096171.80482: attempt loop complete, returning result 13355 1727096171.80486: _execute() done 
13355 1727096171.80489: dumping result to json 13355 1727096171.80492: done dumping result, returning 13355 1727096171.80501: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-c514-593f-000000000630] 13355 1727096171.80504: sending task result for task 0afff68d-5257-c514-593f-000000000630 13355 1727096171.80584: done sending task result for task 0afff68d-5257-c514-593f-000000000630 13355 1727096171.80587: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13355 1727096171.80680: no more pending results, returning what we have 13355 1727096171.80683: results queue empty 13355 1727096171.80684: checking for any_errors_fatal 13355 1727096171.80691: done checking for any_errors_fatal 13355 1727096171.80692: checking for max_fail_percentage 13355 1727096171.80693: done checking for max_fail_percentage 13355 1727096171.80694: checking to see if all hosts have failed and the running result is not ok 13355 1727096171.80695: done checking to see if all hosts have failed 13355 1727096171.80695: getting the remaining hosts for this loop 13355 1727096171.80697: done getting the remaining hosts for this loop 13355 1727096171.80700: getting the next task for host managed_node3 13355 1727096171.80719: done getting next task for host managed_node3 13355 1727096171.80722: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13355 1727096171.80726: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
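The `set_fact` result above records three flags derived from the nmcli output, gated on the two conditionals the log shows being evaluated (`ansible_distribution_major_version != '6'` and `nm_profile_exists.rc == 0`, both True). A sketch of the task under those assumptions; the fact names are taken directly from the log, but the exact YAML layout is inferred:

```yaml
# Hedged sketch of the set_fact task at get_profile_stat.yml:35.
# The facts mirror the "ansible_facts" block in the log output.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when:
    - ansible_distribution_major_version != '6'  # evaluated True in the log
    - nm_profile_exists.rc == 0                  # evaluated True in the log
```

Because this is `set_fact`, no remote connection is needed, which is why the log shows "running the handler" and "handler run complete" with no `_low_level_execute_command` in between.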
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096171.80730: getting variables 13355 1727096171.80731: in VariableManager get_vars() 13355 1727096171.80778: Calling all_inventory to load vars for managed_node3 13355 1727096171.80780: Calling groups_inventory to load vars for managed_node3 13355 1727096171.80782: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096171.80791: Calling all_plugins_play to load vars for managed_node3 13355 1727096171.80794: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096171.80797: Calling groups_plugins_play to load vars for managed_node3 13355 1727096171.81560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096171.82437: done with get_vars() 13355 1727096171.82459: done getting variables 13355 1727096171.82506: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096171.82596: variable 'profile' from source: include params 13355 1727096171.82600: variable 'item' from source: include params 13355 1727096171.82640: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:56:11 -0400 (0:00:00.037) 0:00:21.087 ****** 13355 1727096171.82672: entering _queue_task() for managed_node3/command 13355 1727096171.82928: worker is 1 (out of 1 available) 13355 1727096171.82942: exiting _queue_task() for managed_node3/command 13355 1727096171.82958: done queuing things up, now waiting for results queue to drain 13355 1727096171.82959: waiting for pending results... 13355 1727096171.83131: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 13355 1727096171.83210: in run() - task 0afff68d-5257-c514-593f-000000000632 13355 1727096171.83222: variable 'ansible_search_path' from source: unknown 13355 1727096171.83225: variable 'ansible_search_path' from source: unknown 13355 1727096171.83254: calling self._execute() 13355 1727096171.83329: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.83334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.83343: variable 'omit' from source: magic vars 13355 1727096171.83619: variable 'ansible_distribution_major_version' from source: facts 13355 1727096171.83628: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096171.83709: variable 'profile_stat' from source: set_fact 13355 1727096171.83721: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096171.83724: when evaluation is False, skipping this task 13355 1727096171.83727: _execute() done 13355 1727096171.83732: dumping result to json 13355 1727096171.83734: done dumping result, returning 13355 1727096171.83737: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0afff68d-5257-c514-593f-000000000632] 13355 1727096171.83747: sending task result for task 0afff68d-5257-c514-593f-000000000632 13355 
1727096171.83827: done sending task result for task 0afff68d-5257-c514-593f-000000000632 13355 1727096171.83831: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096171.83904: no more pending results, returning what we have 13355 1727096171.83908: results queue empty 13355 1727096171.83909: checking for any_errors_fatal 13355 1727096171.83915: done checking for any_errors_fatal 13355 1727096171.83916: checking for max_fail_percentage 13355 1727096171.83918: done checking for max_fail_percentage 13355 1727096171.83919: checking to see if all hosts have failed and the running result is not ok 13355 1727096171.83920: done checking to see if all hosts have failed 13355 1727096171.83920: getting the remaining hosts for this loop 13355 1727096171.83921: done getting the remaining hosts for this loop 13355 1727096171.83925: getting the next task for host managed_node3 13355 1727096171.83932: done getting next task for host managed_node3 13355 1727096171.83935: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13355 1727096171.83939: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
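The skipped result above illustrates the guard pattern used by the following ifcfg-inspection tasks: each one carries a `when: profile_stat.stat.exists` condition, and since the bond0.0 profile is stored as a keyfile under `/etc/NetworkManager/system-connections/` rather than an initscripts `ifcfg-*` file, the stat is False and the tasks are skipped. A hedged sketch of the pattern (the grep pattern and ifcfg path are assumptions for illustration, not the collection's exact source):

```yaml
# Hedged sketch of the skip pattern seen above: only inspect
# ifcfg-{{ profile }} when a prior stat registered it as present.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists
```

When the condition is False, Ansible reports `skip_reason: "Conditional result was False"` and `false_condition: "profile_stat.stat.exists"`, exactly as in the JSON above.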
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096171.83945: getting variables 13355 1727096171.83970: in VariableManager get_vars() 13355 1727096171.84023: Calling all_inventory to load vars for managed_node3 13355 1727096171.84025: Calling groups_inventory to load vars for managed_node3 13355 1727096171.84027: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096171.84038: Calling all_plugins_play to load vars for managed_node3 13355 1727096171.84040: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096171.84043: Calling groups_plugins_play to load vars for managed_node3 13355 1727096171.85008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096171.85872: done with get_vars() 13355 1727096171.85893: done getting variables 13355 1727096171.85937: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096171.86052: variable 'profile' from source: include params 13355 1727096171.86057: variable 'item' from source: include params 13355 1727096171.86146: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:56:11 -0400 (0:00:00.035) 0:00:21.122 ****** 13355 1727096171.86207: entering _queue_task() for managed_node3/set_fact 13355 1727096171.86638: worker is 1 (out of 1 available) 13355 1727096171.86649: exiting _queue_task() for managed_node3/set_fact 13355 1727096171.86661: done queuing things up, now waiting for results queue 
to drain 13355 1727096171.86662: waiting for pending results... 13355 1727096171.86985: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 13355 1727096171.87112: in run() - task 0afff68d-5257-c514-593f-000000000633 13355 1727096171.87123: variable 'ansible_search_path' from source: unknown 13355 1727096171.87150: variable 'ansible_search_path' from source: unknown 13355 1727096171.87204: calling self._execute() 13355 1727096171.87333: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.87358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.87362: variable 'omit' from source: magic vars 13355 1727096171.87669: variable 'ansible_distribution_major_version' from source: facts 13355 1727096171.87682: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096171.87772: variable 'profile_stat' from source: set_fact 13355 1727096171.87785: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096171.87789: when evaluation is False, skipping this task 13355 1727096171.87791: _execute() done 13355 1727096171.87793: dumping result to json 13355 1727096171.87795: done dumping result, returning 13355 1727096171.87802: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0afff68d-5257-c514-593f-000000000633] 13355 1727096171.87806: sending task result for task 0afff68d-5257-c514-593f-000000000633 13355 1727096171.87895: done sending task result for task 0afff68d-5257-c514-593f-000000000633 13355 1727096171.87898: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096171.87969: no more pending results, returning what we have 13355 1727096171.87973: results queue empty 13355 1727096171.87974: checking for any_errors_fatal 13355 
1727096171.87980: done checking for any_errors_fatal 13355 1727096171.87981: checking for max_fail_percentage 13355 1727096171.87983: done checking for max_fail_percentage 13355 1727096171.87983: checking to see if all hosts have failed and the running result is not ok 13355 1727096171.87984: done checking to see if all hosts have failed 13355 1727096171.87985: getting the remaining hosts for this loop 13355 1727096171.87986: done getting the remaining hosts for this loop 13355 1727096171.87989: getting the next task for host managed_node3 13355 1727096171.87997: done getting next task for host managed_node3 13355 1727096171.88000: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13355 1727096171.88004: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096171.88011: getting variables 13355 1727096171.88012: in VariableManager get_vars() 13355 1727096171.88062: Calling all_inventory to load vars for managed_node3 13355 1727096171.88064: Calling groups_inventory to load vars for managed_node3 13355 1727096171.88066: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096171.88078: Calling all_plugins_play to load vars for managed_node3 13355 1727096171.88080: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096171.88083: Calling groups_plugins_play to load vars for managed_node3 13355 1727096171.88887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096171.90134: done with get_vars() 13355 1727096171.90162: done getting variables 13355 1727096171.90227: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096171.90339: variable 'profile' from source: include params 13355 1727096171.90343: variable 'item' from source: include params 13355 1727096171.90397: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:56:11 -0400 (0:00:00.042) 0:00:21.164 ****** 13355 1727096171.90428: entering _queue_task() for managed_node3/command 13355 1727096171.90995: worker is 1 (out of 1 available) 13355 1727096171.91002: exiting _queue_task() for managed_node3/command 13355 1727096171.91012: done queuing things up, now waiting for results queue to drain 13355 1727096171.91014: waiting for pending results... 
13355 1727096171.91142: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 13355 1727096171.91188: in run() - task 0afff68d-5257-c514-593f-000000000634 13355 1727096171.91207: variable 'ansible_search_path' from source: unknown 13355 1727096171.91214: variable 'ansible_search_path' from source: unknown 13355 1727096171.91259: calling self._execute() 13355 1727096171.91369: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.91381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.91394: variable 'omit' from source: magic vars 13355 1727096171.91763: variable 'ansible_distribution_major_version' from source: facts 13355 1727096171.91786: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096171.91915: variable 'profile_stat' from source: set_fact 13355 1727096171.91972: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096171.91975: when evaluation is False, skipping this task 13355 1727096171.91977: _execute() done 13355 1727096171.91979: dumping result to json 13355 1727096171.91981: done dumping result, returning 13355 1727096171.91984: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0afff68d-5257-c514-593f-000000000634] 13355 1727096171.91986: sending task result for task 0afff68d-5257-c514-593f-000000000634 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096171.92155: no more pending results, returning what we have 13355 1727096171.92160: results queue empty 13355 1727096171.92161: checking for any_errors_fatal 13355 1727096171.92170: done checking for any_errors_fatal 13355 1727096171.92170: checking for max_fail_percentage 13355 1727096171.92172: done checking for max_fail_percentage 13355 1727096171.92173: checking to see if all hosts have 
failed and the running result is not ok 13355 1727096171.92174: done checking to see if all hosts have failed 13355 1727096171.92174: getting the remaining hosts for this loop 13355 1727096171.92176: done getting the remaining hosts for this loop 13355 1727096171.92179: getting the next task for host managed_node3 13355 1727096171.92187: done getting next task for host managed_node3 13355 1727096171.92189: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13355 1727096171.92194: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096171.92198: getting variables 13355 1727096171.92199: in VariableManager get_vars() 13355 1727096171.92256: Calling all_inventory to load vars for managed_node3 13355 1727096171.92259: Calling groups_inventory to load vars for managed_node3 13355 1727096171.92261: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096171.92476: Calling all_plugins_play to load vars for managed_node3 13355 1727096171.92480: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096171.92485: Calling groups_plugins_play to load vars for managed_node3 13355 1727096171.93181: done sending task result for task 0afff68d-5257-c514-593f-000000000634 13355 1727096171.93185: WORKER PROCESS EXITING 13355 1727096171.93946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096171.95442: done with get_vars() 13355 1727096171.95473: done getting variables 13355 1727096171.95537: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096171.95649: variable 'profile' from source: include params 13355 1727096171.95653: variable 'item' from source: include params 13355 1727096171.95711: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:56:11 -0400 (0:00:00.053) 0:00:21.218 ****** 13355 1727096171.95743: entering _queue_task() for managed_node3/set_fact 13355 1727096171.96108: worker is 1 (out of 1 available) 13355 1727096171.96120: exiting _queue_task() for managed_node3/set_fact 13355 
1727096171.96133: done queuing things up, now waiting for results queue to drain 13355 1727096171.96135: waiting for pending results... 13355 1727096171.96416: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 13355 1727096171.96541: in run() - task 0afff68d-5257-c514-593f-000000000635 13355 1727096171.96563: variable 'ansible_search_path' from source: unknown 13355 1727096171.96575: variable 'ansible_search_path' from source: unknown 13355 1727096171.96619: calling self._execute() 13355 1727096171.96725: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096171.96737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096171.96752: variable 'omit' from source: magic vars 13355 1727096171.97137: variable 'ansible_distribution_major_version' from source: facts 13355 1727096171.97154: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096171.97278: variable 'profile_stat' from source: set_fact 13355 1727096171.97294: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096171.97300: when evaluation is False, skipping this task 13355 1727096171.97306: _execute() done 13355 1727096171.97312: dumping result to json 13355 1727096171.97318: done dumping result, returning 13355 1727096171.97327: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0afff68d-5257-c514-593f-000000000635] 13355 1727096171.97335: sending task result for task 0afff68d-5257-c514-593f-000000000635 13355 1727096171.97439: done sending task result for task 0afff68d-5257-c514-593f-000000000635 13355 1727096171.97445: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096171.97502: no more pending results, returning what we have 13355 1727096171.97507: results queue empty 13355 
1727096171.97508: checking for any_errors_fatal 13355 1727096171.97514: done checking for any_errors_fatal 13355 1727096171.97514: checking for max_fail_percentage 13355 1727096171.97516: done checking for max_fail_percentage 13355 1727096171.97517: checking to see if all hosts have failed and the running result is not ok 13355 1727096171.97518: done checking to see if all hosts have failed 13355 1727096171.97518: getting the remaining hosts for this loop 13355 1727096171.97520: done getting the remaining hosts for this loop 13355 1727096171.97523: getting the next task for host managed_node3 13355 1727096171.97531: done getting next task for host managed_node3 13355 1727096171.97534: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13355 1727096171.97538: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096171.97543: getting variables 13355 1727096171.97544: in VariableManager get_vars() 13355 1727096171.97601: Calling all_inventory to load vars for managed_node3 13355 1727096171.97604: Calling groups_inventory to load vars for managed_node3 13355 1727096171.97606: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096171.97619: Calling all_plugins_play to load vars for managed_node3 13355 1727096171.97622: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096171.97624: Calling groups_plugins_play to load vars for managed_node3 13355 1727096171.99211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.00837: done with get_vars() 13355 1727096172.00861: done getting variables 13355 1727096172.00924: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096172.01039: variable 'profile' from source: include params 13355 1727096172.01043: variable 'item' from source: include params 13355 1727096172.01102: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:56:12 -0400 (0:00:00.053) 0:00:21.271 ****** 13355 1727096172.01134: entering _queue_task() for managed_node3/assert 13355 1727096172.01687: worker is 1 (out of 1 available) 13355 1727096172.01697: exiting _queue_task() for managed_node3/assert 13355 1727096172.01708: done queuing things up, now waiting for results queue to drain 13355 1727096172.01710: waiting for pending results... 
13355 1727096172.01798: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 13355 1727096172.01896: in run() - task 0afff68d-5257-c514-593f-00000000035d 13355 1727096172.01914: variable 'ansible_search_path' from source: unknown 13355 1727096172.01920: variable 'ansible_search_path' from source: unknown 13355 1727096172.01963: calling self._execute() 13355 1727096172.02080: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.02092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.02107: variable 'omit' from source: magic vars 13355 1727096172.02477: variable 'ansible_distribution_major_version' from source: facts 13355 1727096172.02493: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096172.02504: variable 'omit' from source: magic vars 13355 1727096172.02543: variable 'omit' from source: magic vars 13355 1727096172.02651: variable 'profile' from source: include params 13355 1727096172.02661: variable 'item' from source: include params 13355 1727096172.02799: variable 'item' from source: include params 13355 1727096172.02802: variable 'omit' from source: magic vars 13355 1727096172.02805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096172.02844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096172.02870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096172.02893: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.02915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.02949: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13355 1727096172.02957: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.02964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.03071: Set connection var ansible_shell_executable to /bin/sh 13355 1727096172.03083: Set connection var ansible_shell_type to sh 13355 1727096172.03092: Set connection var ansible_pipelining to False 13355 1727096172.03100: Set connection var ansible_connection to ssh 13355 1727096172.03108: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096172.03123: Set connection var ansible_timeout to 10 13355 1727096172.03233: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.03236: variable 'ansible_connection' from source: unknown 13355 1727096172.03238: variable 'ansible_module_compression' from source: unknown 13355 1727096172.03240: variable 'ansible_shell_type' from source: unknown 13355 1727096172.03243: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.03244: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.03246: variable 'ansible_pipelining' from source: unknown 13355 1727096172.03249: variable 'ansible_timeout' from source: unknown 13355 1727096172.03250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.03330: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096172.03351: variable 'omit' from source: magic vars 13355 1727096172.03363: starting attempt loop 13355 1727096172.03373: running the handler 13355 1727096172.03494: variable 'lsr_net_profile_exists' from source: set_fact 13355 1727096172.03505: Evaluated conditional 
(lsr_net_profile_exists): True 13355 1727096172.03514: handler run complete 13355 1727096172.03533: attempt loop complete, returning result 13355 1727096172.03540: _execute() done 13355 1727096172.03546: dumping result to json 13355 1727096172.03556: done dumping result, returning 13355 1727096172.03572: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [0afff68d-5257-c514-593f-00000000035d] 13355 1727096172.03582: sending task result for task 0afff68d-5257-c514-593f-00000000035d 13355 1727096172.03739: done sending task result for task 0afff68d-5257-c514-593f-00000000035d 13355 1727096172.03743: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096172.03829: no more pending results, returning what we have 13355 1727096172.03833: results queue empty 13355 1727096172.03834: checking for any_errors_fatal 13355 1727096172.03841: done checking for any_errors_fatal 13355 1727096172.03842: checking for max_fail_percentage 13355 1727096172.03844: done checking for max_fail_percentage 13355 1727096172.03845: checking to see if all hosts have failed and the running result is not ok 13355 1727096172.03846: done checking to see if all hosts have failed 13355 1727096172.03847: getting the remaining hosts for this loop 13355 1727096172.03848: done getting the remaining hosts for this loop 13355 1727096172.03852: getting the next task for host managed_node3 13355 1727096172.03858: done getting next task for host managed_node3 13355 1727096172.03861: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13355 1727096172.03864: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096172.03870: getting variables 13355 1727096172.03871: in VariableManager get_vars() 13355 1727096172.03930: Calling all_inventory to load vars for managed_node3 13355 1727096172.03932: Calling groups_inventory to load vars for managed_node3 13355 1727096172.03935: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096172.03947: Calling all_plugins_play to load vars for managed_node3 13355 1727096172.03950: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096172.03953: Calling groups_plugins_play to load vars for managed_node3 13355 1727096172.05454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.06943: done with get_vars() 13355 1727096172.06976: done getting variables 13355 1727096172.07043: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096172.07162: variable 'profile' from source: include params 13355 1727096172.07166: variable 'item' from source: include params 13355 1727096172.07225: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:56:12 -0400 
(0:00:00.061) 0:00:21.333 ****** 13355 1727096172.07264: entering _queue_task() for managed_node3/assert 13355 1727096172.07628: worker is 1 (out of 1 available) 13355 1727096172.07641: exiting _queue_task() for managed_node3/assert 13355 1727096172.07656: done queuing things up, now waiting for results queue to drain 13355 1727096172.07657: waiting for pending results... 13355 1727096172.08085: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 13355 1727096172.08090: in run() - task 0afff68d-5257-c514-593f-00000000035e 13355 1727096172.08093: variable 'ansible_search_path' from source: unknown 13355 1727096172.08095: variable 'ansible_search_path' from source: unknown 13355 1727096172.08098: calling self._execute() 13355 1727096172.08196: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.08210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.08224: variable 'omit' from source: magic vars 13355 1727096172.08588: variable 'ansible_distribution_major_version' from source: facts 13355 1727096172.08605: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096172.08616: variable 'omit' from source: magic vars 13355 1727096172.08663: variable 'omit' from source: magic vars 13355 1727096172.08773: variable 'profile' from source: include params 13355 1727096172.08783: variable 'item' from source: include params 13355 1727096172.08848: variable 'item' from source: include params 13355 1727096172.08880: variable 'omit' from source: magic vars 13355 1727096172.08926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096172.08975: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096172.09082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 
1727096172.09086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.09088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.09091: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096172.09093: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.09095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.09200: Set connection var ansible_shell_executable to /bin/sh 13355 1727096172.09211: Set connection var ansible_shell_type to sh 13355 1727096172.09220: Set connection var ansible_pipelining to False 13355 1727096172.09227: Set connection var ansible_connection to ssh 13355 1727096172.09236: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096172.09245: Set connection var ansible_timeout to 10 13355 1727096172.09274: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.09281: variable 'ansible_connection' from source: unknown 13355 1727096172.09287: variable 'ansible_module_compression' from source: unknown 13355 1727096172.09295: variable 'ansible_shell_type' from source: unknown 13355 1727096172.09305: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.09312: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.09319: variable 'ansible_pipelining' from source: unknown 13355 1727096172.09325: variable 'ansible_timeout' from source: unknown 13355 1727096172.09333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.09476: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096172.09518: variable 'omit' from source: magic vars 13355 1727096172.09521: starting attempt loop 13355 1727096172.09524: running the handler 13355 1727096172.09620: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13355 1727096172.09635: Evaluated conditional (lsr_net_profile_ansible_managed): True 13355 1727096172.09644: handler run complete 13355 1727096172.09737: attempt loop complete, returning result 13355 1727096172.09740: _execute() done 13355 1727096172.09742: dumping result to json 13355 1727096172.09745: done dumping result, returning 13355 1727096172.09747: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0afff68d-5257-c514-593f-00000000035e] 13355 1727096172.09750: sending task result for task 0afff68d-5257-c514-593f-00000000035e 13355 1727096172.09817: done sending task result for task 0afff68d-5257-c514-593f-00000000035e 13355 1727096172.09820: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096172.09890: no more pending results, returning what we have 13355 1727096172.09894: results queue empty 13355 1727096172.09895: checking for any_errors_fatal 13355 1727096172.09903: done checking for any_errors_fatal 13355 1727096172.09904: checking for max_fail_percentage 13355 1727096172.09906: done checking for max_fail_percentage 13355 1727096172.09907: checking to see if all hosts have failed and the running result is not ok 13355 1727096172.09907: done checking to see if all hosts have failed 13355 1727096172.09908: getting the remaining hosts for this loop 13355 1727096172.09909: done getting the remaining hosts for this loop 13355 1727096172.09913: getting the next task for host managed_node3 13355 1727096172.09920: done getting 
next task for host managed_node3 13355 1727096172.09922: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13355 1727096172.09926: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096172.09930: getting variables 13355 1727096172.09932: in VariableManager get_vars() 13355 1727096172.09992: Calling all_inventory to load vars for managed_node3 13355 1727096172.09995: Calling groups_inventory to load vars for managed_node3 13355 1727096172.09998: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096172.10010: Calling all_plugins_play to load vars for managed_node3 13355 1727096172.10014: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096172.10017: Calling groups_plugins_play to load vars for managed_node3 13355 1727096172.11699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.13205: done with get_vars() 13355 1727096172.13236: done getting variables 13355 1727096172.13298: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096172.13416: variable 'profile' from source: include params 13355 1727096172.13420: variable 'item' from 
source: include params 13355 1727096172.13481: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:56:12 -0400 (0:00:00.062) 0:00:21.395 ****** 13355 1727096172.13514: entering _queue_task() for managed_node3/assert 13355 1727096172.13854: worker is 1 (out of 1 available) 13355 1727096172.13869: exiting _queue_task() for managed_node3/assert 13355 1727096172.13883: done queuing things up, now waiting for results queue to drain 13355 1727096172.13884: waiting for pending results... 13355 1727096172.14286: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 13355 1727096172.14291: in run() - task 0afff68d-5257-c514-593f-00000000035f 13355 1727096172.14293: variable 'ansible_search_path' from source: unknown 13355 1727096172.14296: variable 'ansible_search_path' from source: unknown 13355 1727096172.14323: calling self._execute() 13355 1727096172.14424: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.14436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.14449: variable 'omit' from source: magic vars 13355 1727096172.14823: variable 'ansible_distribution_major_version' from source: facts 13355 1727096172.14840: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096172.14851: variable 'omit' from source: magic vars 13355 1727096172.14893: variable 'omit' from source: magic vars 13355 1727096172.15004: variable 'profile' from source: include params 13355 1727096172.15014: variable 'item' from source: include params 13355 1727096172.15086: variable 'item' from source: include params 13355 1727096172.15111: variable 'omit' from source: magic vars 13355 1727096172.15159: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096172.15201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096172.15225: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096172.15251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.15267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.15302: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096172.15310: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.15317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.15423: Set connection var ansible_shell_executable to /bin/sh 13355 1727096172.15436: Set connection var ansible_shell_type to sh 13355 1727096172.15445: Set connection var ansible_pipelining to False 13355 1727096172.15453: Set connection var ansible_connection to ssh 13355 1727096172.15465: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096172.15479: Set connection var ansible_timeout to 10 13355 1727096172.15581: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.15584: variable 'ansible_connection' from source: unknown 13355 1727096172.15587: variable 'ansible_module_compression' from source: unknown 13355 1727096172.15588: variable 'ansible_shell_type' from source: unknown 13355 1727096172.15590: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.15592: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.15594: variable 'ansible_pipelining' from source: unknown 13355 1727096172.15596: variable 'ansible_timeout' 
from source: unknown 13355 1727096172.15598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.15691: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096172.15706: variable 'omit' from source: magic vars 13355 1727096172.15716: starting attempt loop 13355 1727096172.15722: running the handler 13355 1727096172.15838: variable 'lsr_net_profile_fingerprint' from source: set_fact 13355 1727096172.15847: Evaluated conditional (lsr_net_profile_fingerprint): True 13355 1727096172.15857: handler run complete 13355 1727096172.15876: attempt loop complete, returning result 13355 1727096172.15883: _execute() done 13355 1727096172.15888: dumping result to json 13355 1727096172.15895: done dumping result, returning 13355 1727096172.16016: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [0afff68d-5257-c514-593f-00000000035f] 13355 1727096172.16019: sending task result for task 0afff68d-5257-c514-593f-00000000035f 13355 1727096172.16086: done sending task result for task 0afff68d-5257-c514-593f-00000000035f 13355 1727096172.16090: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096172.16166: no more pending results, returning what we have 13355 1727096172.16172: results queue empty 13355 1727096172.16173: checking for any_errors_fatal 13355 1727096172.16182: done checking for any_errors_fatal 13355 1727096172.16183: checking for max_fail_percentage 13355 1727096172.16184: done checking for max_fail_percentage 13355 1727096172.16185: checking to see if all hosts have failed and the running result is not ok 13355 1727096172.16186: done checking to see if all 
hosts have failed 13355 1727096172.16186: getting the remaining hosts for this loop 13355 1727096172.16188: done getting the remaining hosts for this loop 13355 1727096172.16192: getting the next task for host managed_node3 13355 1727096172.16201: done getting next task for host managed_node3 13355 1727096172.16205: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13355 1727096172.16208: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096172.16212: getting variables 13355 1727096172.16214: in VariableManager get_vars() 13355 1727096172.16273: Calling all_inventory to load vars for managed_node3 13355 1727096172.16276: Calling groups_inventory to load vars for managed_node3 13355 1727096172.16279: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096172.16292: Calling all_plugins_play to load vars for managed_node3 13355 1727096172.16295: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096172.16298: Calling groups_plugins_play to load vars for managed_node3 13355 1727096172.18057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.19572: done with get_vars() 13355 1727096172.19609: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Monday 23 September 2024 08:56:12 -0400 (0:00:00.061) 0:00:21.457 ****** 13355 1727096172.19716: entering _queue_task() for managed_node3/include_tasks 13355 1727096172.20081: worker is 1 (out of 1 available) 13355 1727096172.20094: exiting _queue_task() for managed_node3/include_tasks 13355 1727096172.20108: done queuing things up, now waiting for results queue to drain 13355 1727096172.20110: waiting for pending results... 13355 1727096172.20488: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 13355 1727096172.20497: in run() - task 0afff68d-5257-c514-593f-000000000363 13355 1727096172.20517: variable 'ansible_search_path' from source: unknown 13355 1727096172.20524: variable 'ansible_search_path' from source: unknown 13355 1727096172.20563: calling self._execute() 13355 1727096172.20663: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.20676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.20689: variable 'omit' from source: magic vars 13355 1727096172.21049: variable 'ansible_distribution_major_version' from source: facts 13355 1727096172.21069: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096172.21082: _execute() done 13355 1727096172.21091: dumping result to json 13355 1727096172.21099: done dumping result, returning 13355 1727096172.21110: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [0afff68d-5257-c514-593f-000000000363] 13355 1727096172.21119: sending task result for task 0afff68d-5257-c514-593f-000000000363 13355 1727096172.21260: no more pending results, returning what we have 13355 1727096172.21266: in VariableManager get_vars() 13355 1727096172.21331: Calling all_inventory to load vars for managed_node3 13355 1727096172.21334: 
Calling groups_inventory to load vars for managed_node3 13355 1727096172.21336: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096172.21351: Calling all_plugins_play to load vars for managed_node3 13355 1727096172.21354: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096172.21358: Calling groups_plugins_play to load vars for managed_node3 13355 1727096172.22281: done sending task result for task 0afff68d-5257-c514-593f-000000000363 13355 1727096172.22285: WORKER PROCESS EXITING 13355 1727096172.23109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.24578: done with get_vars() 13355 1727096172.24598: variable 'ansible_search_path' from source: unknown 13355 1727096172.24600: variable 'ansible_search_path' from source: unknown 13355 1727096172.24638: we have included files to process 13355 1727096172.24640: generating all_blocks data 13355 1727096172.24642: done generating all_blocks data 13355 1727096172.24646: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096172.24647: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096172.24649: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13355 1727096172.25538: done processing included file 13355 1727096172.25541: iterating over new_blocks loaded from include file 13355 1727096172.25542: in VariableManager get_vars() 13355 1727096172.25572: done with get_vars() 13355 1727096172.25574: filtering new block on tags 13355 1727096172.25599: done filtering new block on tags 13355 1727096172.25602: in VariableManager get_vars() 13355 1727096172.25626: done with get_vars() 13355 1727096172.25628: filtering new block 
on tags 13355 1727096172.25649: done filtering new block on tags 13355 1727096172.25651: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 13355 1727096172.25657: extending task lists for all hosts with included blocks 13355 1727096172.25824: done extending task lists 13355 1727096172.25825: done processing included files 13355 1727096172.25826: results queue empty 13355 1727096172.25826: checking for any_errors_fatal 13355 1727096172.25830: done checking for any_errors_fatal 13355 1727096172.25831: checking for max_fail_percentage 13355 1727096172.25832: done checking for max_fail_percentage 13355 1727096172.25833: checking to see if all hosts have failed and the running result is not ok 13355 1727096172.25834: done checking to see if all hosts have failed 13355 1727096172.25834: getting the remaining hosts for this loop 13355 1727096172.25836: done getting the remaining hosts for this loop 13355 1727096172.25838: getting the next task for host managed_node3 13355 1727096172.25842: done getting next task for host managed_node3 13355 1727096172.25844: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13355 1727096172.25847: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096172.25849: getting variables 13355 1727096172.25850: in VariableManager get_vars() 13355 1727096172.25870: Calling all_inventory to load vars for managed_node3 13355 1727096172.25872: Calling groups_inventory to load vars for managed_node3 13355 1727096172.25875: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096172.25880: Calling all_plugins_play to load vars for managed_node3 13355 1727096172.25883: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096172.25885: Calling groups_plugins_play to load vars for managed_node3 13355 1727096172.27094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.28916: done with get_vars() 13355 1727096172.28940: done getting variables 13355 1727096172.28986: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Monday 23 September 2024 08:56:12 -0400 (0:00:00.093) 0:00:21.550 ****** 13355 1727096172.29019: entering _queue_task() for managed_node3/set_fact 13355 1727096172.29409: worker is 1 (out of 1 available) 13355 1727096172.29420: exiting _queue_task() for managed_node3/set_fact 13355 1727096172.29433: done queuing things up, now waiting for results queue to drain 13355 1727096172.29434: waiting for pending results... 
13355 1727096172.29670: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 13355 1727096172.29780: in run() - task 0afff68d-5257-c514-593f-000000000674 13355 1727096172.29801: variable 'ansible_search_path' from source: unknown 13355 1727096172.29808: variable 'ansible_search_path' from source: unknown 13355 1727096172.29848: calling self._execute() 13355 1727096172.29953: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.29966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.29982: variable 'omit' from source: magic vars 13355 1727096172.30350: variable 'ansible_distribution_major_version' from source: facts 13355 1727096172.30370: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096172.30382: variable 'omit' from source: magic vars 13355 1727096172.30440: variable 'omit' from source: magic vars 13355 1727096172.30773: variable 'omit' from source: magic vars 13355 1727096172.30776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096172.30893: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096172.30896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096172.30899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.30901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.30903: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096172.30905: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.30907: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13355 1727096172.31141: Set connection var ansible_shell_executable to /bin/sh 13355 1727096172.31153: Set connection var ansible_shell_type to sh 13355 1727096172.31164: Set connection var ansible_pipelining to False 13355 1727096172.31182: Set connection var ansible_connection to ssh 13355 1727096172.31192: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096172.31200: Set connection var ansible_timeout to 10 13355 1727096172.31233: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.31242: variable 'ansible_connection' from source: unknown 13355 1727096172.31249: variable 'ansible_module_compression' from source: unknown 13355 1727096172.31255: variable 'ansible_shell_type' from source: unknown 13355 1727096172.31261: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.31334: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.31342: variable 'ansible_pipelining' from source: unknown 13355 1727096172.31350: variable 'ansible_timeout' from source: unknown 13355 1727096172.31358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.31614: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096172.31632: variable 'omit' from source: magic vars 13355 1727096172.31662: starting attempt loop 13355 1727096172.31672: running the handler 13355 1727096172.31762: handler run complete 13355 1727096172.31776: attempt loop complete, returning result 13355 1727096172.31784: _execute() done 13355 1727096172.31791: dumping result to json 13355 1727096172.31798: done dumping result, returning 13355 1727096172.31807: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0afff68d-5257-c514-593f-000000000674] 13355 1727096172.31816: sending task result for task 0afff68d-5257-c514-593f-000000000674 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13355 1727096172.32060: no more pending results, returning what we have 13355 1727096172.32064: results queue empty 13355 1727096172.32065: checking for any_errors_fatal 13355 1727096172.32067: done checking for any_errors_fatal 13355 1727096172.32069: checking for max_fail_percentage 13355 1727096172.32071: done checking for max_fail_percentage 13355 1727096172.32071: checking to see if all hosts have failed and the running result is not ok 13355 1727096172.32072: done checking to see if all hosts have failed 13355 1727096172.32073: getting the remaining hosts for this loop 13355 1727096172.32074: done getting the remaining hosts for this loop 13355 1727096172.32078: getting the next task for host managed_node3 13355 1727096172.32085: done getting next task for host managed_node3 13355 1727096172.32088: ^ task is: TASK: Stat profile file 13355 1727096172.32093: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096172.32098: getting variables 13355 1727096172.32099: in VariableManager get_vars() 13355 1727096172.32156: Calling all_inventory to load vars for managed_node3 13355 1727096172.32159: Calling groups_inventory to load vars for managed_node3 13355 1727096172.32162: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096172.32384: Calling all_plugins_play to load vars for managed_node3 13355 1727096172.32388: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096172.32392: Calling groups_plugins_play to load vars for managed_node3 13355 1727096172.33082: done sending task result for task 0afff68d-5257-c514-593f-000000000674 13355 1727096172.33086: WORKER PROCESS EXITING 13355 1727096172.33986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.35528: done with get_vars() 13355 1727096172.35558: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Monday 23 September 2024 08:56:12 -0400 (0:00:00.066) 0:00:21.617 ****** 13355 1727096172.35657: entering _queue_task() for managed_node3/stat 13355 1727096172.36008: worker is 1 (out of 1 available) 13355 1727096172.36019: exiting _queue_task() for managed_node3/stat 13355 1727096172.36032: done queuing things up, now waiting for results queue to drain 13355 1727096172.36034: waiting for pending results... 
13355 1727096172.36305: running TaskExecutor() for managed_node3/TASK: Stat profile file 13355 1727096172.36433: in run() - task 0afff68d-5257-c514-593f-000000000675 13355 1727096172.36451: variable 'ansible_search_path' from source: unknown 13355 1727096172.36458: variable 'ansible_search_path' from source: unknown 13355 1727096172.36503: calling self._execute() 13355 1727096172.36600: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.36612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.36627: variable 'omit' from source: magic vars 13355 1727096172.36993: variable 'ansible_distribution_major_version' from source: facts 13355 1727096172.37010: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096172.37022: variable 'omit' from source: magic vars 13355 1727096172.37076: variable 'omit' from source: magic vars 13355 1727096172.37177: variable 'profile' from source: include params 13355 1727096172.37188: variable 'item' from source: include params 13355 1727096172.37256: variable 'item' from source: include params 13355 1727096172.37283: variable 'omit' from source: magic vars 13355 1727096172.37326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096172.37372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096172.37396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096172.37418: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.37438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.37478: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 
1727096172.37487: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.37494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.37599: Set connection var ansible_shell_executable to /bin/sh 13355 1727096172.37610: Set connection var ansible_shell_type to sh 13355 1727096172.37872: Set connection var ansible_pipelining to False 13355 1727096172.37875: Set connection var ansible_connection to ssh 13355 1727096172.37877: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096172.37880: Set connection var ansible_timeout to 10 13355 1727096172.37881: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.37883: variable 'ansible_connection' from source: unknown 13355 1727096172.37885: variable 'ansible_module_compression' from source: unknown 13355 1727096172.37887: variable 'ansible_shell_type' from source: unknown 13355 1727096172.37889: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.37890: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.37892: variable 'ansible_pipelining' from source: unknown 13355 1727096172.37894: variable 'ansible_timeout' from source: unknown 13355 1727096172.37896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.37899: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096172.37914: variable 'omit' from source: magic vars 13355 1727096172.37923: starting attempt loop 13355 1727096172.37928: running the handler 13355 1727096172.37943: _low_level_execute_command(): starting 13355 1727096172.37952: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096172.38653: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096172.38689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096172.38789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096172.38802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.38828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.38907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.40613: stdout chunk (state=3): >>>/root <<< 13355 1727096172.41014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.41019: stderr chunk (state=3): >>><<< 13355 1727096172.41021: stdout chunk (state=3): >>><<< 13355 1727096172.41024: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096172.41027: _low_level_execute_command(): starting 13355 1727096172.41031: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637 `" && echo ansible-tmp-1727096172.4079707-14346-25456019309637="` echo /root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637 `" ) && sleep 0' 13355 1727096172.42492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.42529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096172.42595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.42631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.42726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.44755: stdout chunk (state=3): >>>ansible-tmp-1727096172.4079707-14346-25456019309637=/root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637 <<< 13355 1727096172.44898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.44905: stdout chunk (state=3): >>><<< 13355 1727096172.44911: stderr chunk (state=3): >>><<< 13355 1727096172.44938: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096172.4079707-14346-25456019309637=/root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096172.45188: variable 'ansible_module_compression' from source: unknown 13355 1727096172.45192: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13355 1727096172.45194: variable 'ansible_facts' from source: unknown 13355 1727096172.45196: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/AnsiballZ_stat.py 13355 1727096172.45390: Sending initial data 13355 1727096172.45393: Sent initial data (152 bytes) 13355 1727096172.46185: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096172.46192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096172.46203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096172.46216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096172.46233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096172.46236: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096172.46244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.46262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 13355 1727096172.46276: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096172.46280: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096172.46289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096172.46298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096172.46309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096172.46317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096172.46323: stderr chunk (state=3): >>>debug2: match found <<< 13355 1727096172.46333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.46407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.46415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.46481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.48133: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13355 1727096172.48196: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096172.48214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096172.48254: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp6jia4ee_ /root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/AnsiballZ_stat.py <<< 13355 1727096172.48258: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/AnsiballZ_stat.py" <<< 13355 1727096172.48302: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp6jia4ee_" to remote "/root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/AnsiballZ_stat.py" <<< 13355 1727096172.49165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.49330: stderr chunk (state=3): >>><<< 13355 1727096172.49333: stdout chunk (state=3): >>><<< 13355 1727096172.49343: done transferring module to remote 13355 1727096172.49386: _low_level_execute_command(): starting 13355 1727096172.49420: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/ /root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/AnsiballZ_stat.py && sleep 0' 13355 1727096172.50138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096172.50185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096172.50200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096172.50256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.50308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096172.50354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.50414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.50418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.52348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.52420: stderr chunk (state=3): >>><<< 13355 1727096172.52423: stdout chunk (state=3): >>><<< 13355 1727096172.52529: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096172.52532: _low_level_execute_command(): starting 13355 1727096172.52534: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/AnsiballZ_stat.py && sleep 0' 13355 1727096172.53192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096172.53234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.53247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096172.53342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.53370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.53450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.69472: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}}<<< 13355 1727096172.69506: stdout chunk (state=3): >>> <<< 13355 1727096172.70965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096172.70972: stdout chunk (state=3): >>><<< 13355 1727096172.70974: stderr chunk (state=3): >>><<< 13355 1727096172.71113: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096172.71118: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096172.71121: _low_level_execute_command(): starting 13355 1727096172.71124: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096172.4079707-14346-25456019309637/ > /dev/null 2>&1 && sleep 0' 13355 1727096172.71733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096172.71786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096172.71878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.71903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.71932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.71999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.73919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.73986: stderr chunk (state=3): >>><<< 13355 1727096172.73996: stdout chunk (state=3): >>><<< 13355 1727096172.74020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096172.74034: handler run complete 13355 1727096172.74079: attempt loop complete, returning result 13355 1727096172.74082: _execute() done 13355 1727096172.74084: dumping result to json 13355 1727096172.74093: done dumping result, returning 13355 1727096172.74106: done running TaskExecutor() for managed_node3/TASK: Stat profile file [0afff68d-5257-c514-593f-000000000675] 13355 1727096172.74115: sending task result for task 0afff68d-5257-c514-593f-000000000675 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 13355 1727096172.74288: no more pending results, returning what we have 13355 1727096172.74292: results queue empty 13355 1727096172.74293: checking for any_errors_fatal 13355 1727096172.74298: done checking for any_errors_fatal 13355 1727096172.74299: checking for max_fail_percentage 13355 1727096172.74300: done checking for max_fail_percentage 13355 1727096172.74301: checking to see if all hosts have failed and the running result is not ok 13355 1727096172.74302: done checking to see if all hosts have failed 13355 1727096172.74303: getting the remaining hosts for this loop 13355 1727096172.74304: done getting the remaining hosts for this loop 13355 1727096172.74308: getting the next task for host managed_node3 13355 1727096172.74315: done getting next task for host managed_node3 13355 1727096172.74317: ^ task is: TASK: Set NM profile 
exist flag based on the profile files 13355 1727096172.74322: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096172.74326: getting variables 13355 1727096172.74328: in VariableManager get_vars() 13355 1727096172.74388: Calling all_inventory to load vars for managed_node3 13355 1727096172.74391: Calling groups_inventory to load vars for managed_node3 13355 1727096172.74393: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096172.74406: Calling all_plugins_play to load vars for managed_node3 13355 1727096172.74409: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096172.74413: Calling groups_plugins_play to load vars for managed_node3 13355 1727096172.75218: done sending task result for task 0afff68d-5257-c514-593f-000000000675 13355 1727096172.75227: WORKER PROCESS EXITING 13355 1727096172.76122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.76982: done with get_vars() 13355 1727096172.77003: done getting variables 13355 1727096172.77049: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Monday 23 September 2024 08:56:12 -0400 (0:00:00.414) 0:00:22.031 ****** 13355 1727096172.77076: entering _queue_task() for managed_node3/set_fact 13355 1727096172.77355: worker is 1 (out of 1 available) 13355 1727096172.77371: exiting _queue_task() for managed_node3/set_fact 13355 1727096172.77388: done queuing things up, now waiting for results queue to drain 13355 1727096172.77390: waiting for pending results... 13355 1727096172.77637: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 13355 1727096172.77734: in run() - task 0afff68d-5257-c514-593f-000000000676 13355 1727096172.77744: variable 'ansible_search_path' from source: unknown 13355 1727096172.77748: variable 'ansible_search_path' from source: unknown 13355 1727096172.77974: calling self._execute() 13355 1727096172.77977: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.77980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.77984: variable 'omit' from source: magic vars 13355 1727096172.78320: variable 'ansible_distribution_major_version' from source: facts 13355 1727096172.78331: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096172.78467: variable 'profile_stat' from source: set_fact 13355 1727096172.78481: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096172.78484: when evaluation is False, skipping this task 13355 1727096172.78487: _execute() 
done 13355 1727096172.78490: dumping result to json 13355 1727096172.78492: done dumping result, returning 13355 1727096172.78501: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [0afff68d-5257-c514-593f-000000000676] 13355 1727096172.78503: sending task result for task 0afff68d-5257-c514-593f-000000000676 13355 1727096172.78636: done sending task result for task 0afff68d-5257-c514-593f-000000000676 13355 1727096172.78639: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096172.78723: no more pending results, returning what we have 13355 1727096172.78727: results queue empty 13355 1727096172.78728: checking for any_errors_fatal 13355 1727096172.78734: done checking for any_errors_fatal 13355 1727096172.78734: checking for max_fail_percentage 13355 1727096172.78736: done checking for max_fail_percentage 13355 1727096172.78737: checking to see if all hosts have failed and the running result is not ok 13355 1727096172.78737: done checking to see if all hosts have failed 13355 1727096172.78738: getting the remaining hosts for this loop 13355 1727096172.78739: done getting the remaining hosts for this loop 13355 1727096172.78743: getting the next task for host managed_node3 13355 1727096172.78754: done getting next task for host managed_node3 13355 1727096172.78757: ^ task is: TASK: Get NM profile info 13355 1727096172.78761: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096172.78765: getting variables 13355 1727096172.78768: in VariableManager get_vars() 13355 1727096172.78813: Calling all_inventory to load vars for managed_node3 13355 1727096172.78815: Calling groups_inventory to load vars for managed_node3 13355 1727096172.78817: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096172.78828: Calling all_plugins_play to load vars for managed_node3 13355 1727096172.78830: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096172.78833: Calling groups_plugins_play to load vars for managed_node3 13355 1727096172.80079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096172.81195: done with get_vars() 13355 1727096172.81216: done getting variables 13355 1727096172.81264: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Monday 23 September 2024 08:56:12 -0400 (0:00:00.042) 0:00:22.073 ****** 13355 1727096172.81290: entering _queue_task() for managed_node3/shell 13355 
1727096172.81546: worker is 1 (out of 1 available) 13355 1727096172.81559: exiting _queue_task() for managed_node3/shell 13355 1727096172.81573: done queuing things up, now waiting for results queue to drain 13355 1727096172.81575: waiting for pending results... 13355 1727096172.81748: running TaskExecutor() for managed_node3/TASK: Get NM profile info 13355 1727096172.81829: in run() - task 0afff68d-5257-c514-593f-000000000677 13355 1727096172.81841: variable 'ansible_search_path' from source: unknown 13355 1727096172.81844: variable 'ansible_search_path' from source: unknown 13355 1727096172.81875: calling self._execute() 13355 1727096172.81955: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.81963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.81973: variable 'omit' from source: magic vars 13355 1727096172.82252: variable 'ansible_distribution_major_version' from source: facts 13355 1727096172.82263: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096172.82270: variable 'omit' from source: magic vars 13355 1727096172.82304: variable 'omit' from source: magic vars 13355 1727096172.82380: variable 'profile' from source: include params 13355 1727096172.82384: variable 'item' from source: include params 13355 1727096172.82429: variable 'item' from source: include params 13355 1727096172.82445: variable 'omit' from source: magic vars 13355 1727096172.82485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096172.82574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096172.82578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096172.82581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 
1727096172.82583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096172.82612: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096172.82615: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.82617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.82745: Set connection var ansible_shell_executable to /bin/sh 13355 1727096172.82748: Set connection var ansible_shell_type to sh 13355 1727096172.82751: Set connection var ansible_pipelining to False 13355 1727096172.82753: Set connection var ansible_connection to ssh 13355 1727096172.82755: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096172.82758: Set connection var ansible_timeout to 10 13355 1727096172.82793: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.82796: variable 'ansible_connection' from source: unknown 13355 1727096172.82799: variable 'ansible_module_compression' from source: unknown 13355 1727096172.82810: variable 'ansible_shell_type' from source: unknown 13355 1727096172.82813: variable 'ansible_shell_executable' from source: unknown 13355 1727096172.82815: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096172.82817: variable 'ansible_pipelining' from source: unknown 13355 1727096172.82819: variable 'ansible_timeout' from source: unknown 13355 1727096172.82821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096172.83095: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096172.83098: variable 'omit' from source: 
magic vars 13355 1727096172.83100: starting attempt loop 13355 1727096172.83103: running the handler 13355 1727096172.83106: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096172.83108: _low_level_execute_command(): starting 13355 1727096172.83110: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096172.83698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096172.83725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.83788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096172.83801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.83817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.83873: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13355 1727096172.85566: stdout chunk (state=3): >>>/root <<< 13355 1727096172.85661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.85705: stderr chunk (state=3): >>><<< 13355 1727096172.85707: stdout chunk (state=3): >>><<< 13355 1727096172.85722: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096172.85737: _low_level_execute_command(): starting 13355 1727096172.85775: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868 `" && echo ansible-tmp-1727096172.8572721-14374-136781554214868="` echo 
/root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868 `" ) && sleep 0' 13355 1727096172.86213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096172.86224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096172.86228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096172.86233: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096172.86235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.86281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.86285: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.86325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.88299: stdout chunk (state=3): >>>ansible-tmp-1727096172.8572721-14374-136781554214868=/root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868 <<< 13355 1727096172.88405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.88432: stderr chunk (state=3): >>><<< 13355 
1727096172.88435: stdout chunk (state=3): >>><<< 13355 1727096172.88455: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096172.8572721-14374-136781554214868=/root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096172.88484: variable 'ansible_module_compression' from source: unknown 13355 1727096172.88530: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096172.88560: variable 'ansible_facts' from source: unknown 13355 1727096172.88621: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/AnsiballZ_command.py 13355 1727096172.88724: Sending initial data 13355 1727096172.88727: Sent initial data (156 bytes) 13355 
1727096172.89164: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096172.89195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.89199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096172.89201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.89256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096172.89259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.89266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.89302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.90940: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096172.90969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096172.91002: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmphgst7_cr /root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/AnsiballZ_command.py <<< 13355 1727096172.91008: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/AnsiballZ_command.py" <<< 13355 1727096172.91041: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmphgst7_cr" to remote "/root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/AnsiballZ_command.py" <<< 13355 1727096172.91044: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/AnsiballZ_command.py" <<< 13355 1727096172.91544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.91594: stderr chunk (state=3): >>><<< 13355 1727096172.91598: stdout chunk (state=3): >>><<< 13355 1727096172.91638: done transferring module to remote 13355 1727096172.91647: _low_level_execute_command(): starting 13355 1727096172.91656: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/ /root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/AnsiballZ_command.py && sleep 0' 13355 1727096172.92126: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096172.92129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.92136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096172.92138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096172.92140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.92173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096172.92188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.92233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096172.94065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096172.94088: stderr chunk (state=3): >>><<< 13355 1727096172.94091: stdout chunk (state=3): >>><<< 13355 1727096172.94106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096172.94108: _low_level_execute_command(): starting 13355 1727096172.94114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/AnsiballZ_command.py && sleep 0' 13355 1727096172.94547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096172.94551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096172.94573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096172.94587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 
1727096172.94591: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096172.94650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096172.94653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096172.94655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096172.94703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096173.12571: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-23 08:56:13.101561", "end": "2024-09-23 08:56:13.122542", "delta": "0:00:00.020981", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096173.14375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096173.14379: stdout chunk (state=3): >>><<< 13355 1727096173.14381: stderr chunk (state=3): >>><<< 13355 1727096173.14384: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-23 08:56:13.101561", "end": "2024-09-23 08:56:13.122542", "delta": "0:00:00.020981", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.14.152 closed. 13355 1727096173.14387: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096173.14394: _low_level_execute_command(): starting 13355 1727096173.14396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096172.8572721-14374-136781554214868/ > /dev/null 2>&1 && sleep 0' 13355 1727096173.15087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096173.15094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096173.15107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096173.15159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096173.15218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096173.15235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096173.15250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096173.15322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096173.17274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096173.17278: stderr chunk (state=3): >>><<< 13355 1727096173.17280: stdout chunk (state=3): >>><<< 13355 1727096173.17283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096173.17285: handler run complete 13355 1727096173.17291: Evaluated conditional (False): False 13355 1727096173.17301: attempt loop complete, returning result 13355 1727096173.17304: _execute() done 13355 1727096173.17306: dumping result to json 13355 1727096173.17312: done dumping result, returning 13355 1727096173.17322: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [0afff68d-5257-c514-593f-000000000677] 13355 1727096173.17326: sending task result for task 0afff68d-5257-c514-593f-000000000677 13355 1727096173.17431: done sending task result for task 0afff68d-5257-c514-593f-000000000677 13355 1727096173.17435: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020981", "end": "2024-09-23 08:56:13.122542", "rc": 0, "start": "2024-09-23 08:56:13.101561" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 13355 1727096173.17515: no more pending results, returning what we have 13355 1727096173.17519: results queue empty 13355 1727096173.17520: checking for any_errors_fatal 13355 1727096173.17527: done checking for any_errors_fatal 13355 1727096173.17527: checking for max_fail_percentage 13355 1727096173.17529: done checking for max_fail_percentage 13355 1727096173.17530: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.17531: done checking to see if all hosts have failed 13355 1727096173.17531: getting the remaining hosts for this loop 13355 1727096173.17533: done getting the remaining hosts for this loop 13355 1727096173.17536: getting the next task for host managed_node3 13355 1727096173.17544: done getting next task for host managed_node3 13355 
1727096173.17547: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13355 1727096173.17552: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096173.17556: getting variables 13355 1727096173.17558: in VariableManager get_vars() 13355 1727096173.17729: Calling all_inventory to load vars for managed_node3 13355 1727096173.17731: Calling groups_inventory to load vars for managed_node3 13355 1727096173.17734: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.17746: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.17749: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.17752: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.19398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.20975: done with get_vars() 13355 1727096173.21009: done getting variables 13355 1727096173.21077: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Monday 23 September 2024 08:56:13 -0400 (0:00:00.398) 0:00:22.471 ****** 13355 1727096173.21114: entering _queue_task() for managed_node3/set_fact 13355 1727096173.21497: worker is 1 (out of 1 available) 13355 1727096173.21510: exiting _queue_task() for managed_node3/set_fact 13355 1727096173.21636: done queuing things up, now waiting for results queue to drain 13355 1727096173.21638: waiting for pending results... 13355 1727096173.22085: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13355 1727096173.22091: in run() - task 0afff68d-5257-c514-593f-000000000678 13355 1727096173.22094: variable 'ansible_search_path' from source: unknown 13355 1727096173.22097: variable 'ansible_search_path' from source: unknown 13355 1727096173.22100: calling self._execute() 13355 1727096173.22103: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.22105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.22108: variable 'omit' from source: magic vars 13355 1727096173.22460: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.22472: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.22606: variable 'nm_profile_exists' from source: set_fact 13355 1727096173.22632: Evaluated conditional (nm_profile_exists.rc == 0): True 13355 1727096173.22637: variable 'omit' from source: magic vars 13355 1727096173.22683: variable 'omit' from source: magic vars 13355 1727096173.22713: 
variable 'omit' from source: magic vars 13355 1727096173.22759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096173.22795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096173.22815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096173.22973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.22976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.22979: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096173.22981: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.22983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.22994: Set connection var ansible_shell_executable to /bin/sh 13355 1727096173.23001: Set connection var ansible_shell_type to sh 13355 1727096173.23007: Set connection var ansible_pipelining to False 13355 1727096173.23012: Set connection var ansible_connection to ssh 13355 1727096173.23017: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096173.23023: Set connection var ansible_timeout to 10 13355 1727096173.23056: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.23060: variable 'ansible_connection' from source: unknown 13355 1727096173.23062: variable 'ansible_module_compression' from source: unknown 13355 1727096173.23064: variable 'ansible_shell_type' from source: unknown 13355 1727096173.23066: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.23070: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.23072: variable 'ansible_pipelining' from 
source: unknown 13355 1727096173.23074: variable 'ansible_timeout' from source: unknown 13355 1727096173.23079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.23219: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096173.23230: variable 'omit' from source: magic vars 13355 1727096173.23236: starting attempt loop 13355 1727096173.23239: running the handler 13355 1727096173.23251: handler run complete 13355 1727096173.23268: attempt loop complete, returning result 13355 1727096173.23271: _execute() done 13355 1727096173.23274: dumping result to json 13355 1727096173.23276: done dumping result, returning 13355 1727096173.23286: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0afff68d-5257-c514-593f-000000000678] 13355 1727096173.23290: sending task result for task 0afff68d-5257-c514-593f-000000000678 ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13355 1727096173.23433: no more pending results, returning what we have 13355 1727096173.23437: results queue empty 13355 1727096173.23438: checking for any_errors_fatal 13355 1727096173.23448: done checking for any_errors_fatal 13355 1727096173.23448: checking for max_fail_percentage 13355 1727096173.23450: done checking for max_fail_percentage 13355 1727096173.23451: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.23451: done checking to see if all hosts have failed 13355 1727096173.23452: getting the remaining hosts for this loop 13355 1727096173.23453: done 
getting the remaining hosts for this loop 13355 1727096173.23457: getting the next task for host managed_node3 13355 1727096173.23468: done getting next task for host managed_node3 13355 1727096173.23471: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13355 1727096173.23480: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096173.23486: getting variables 13355 1727096173.23487: in VariableManager get_vars() 13355 1727096173.23545: Calling all_inventory to load vars for managed_node3 13355 1727096173.23548: Calling groups_inventory to load vars for managed_node3 13355 1727096173.23550: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.23562: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.23566: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.23778: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.24299: done sending task result for task 0afff68d-5257-c514-593f-000000000678 13355 1727096173.24303: WORKER PROCESS EXITING 13355 1727096173.25192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.26857: done with get_vars() 13355 1727096173.26883: done getting variables 13355 1727096173.26948: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096173.27081: variable 'profile' from source: include params 13355 1727096173.27085: variable 'item' from source: include params 13355 1727096173.27145: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Monday 23 September 2024 08:56:13 -0400 (0:00:00.060) 0:00:22.532 ****** 13355 1727096173.27188: entering _queue_task() for managed_node3/command 13355 1727096173.27557: worker is 1 (out of 1 available) 13355 1727096173.27573: exiting _queue_task() for managed_node3/command 13355 
1727096173.27586: done queuing things up, now waiting for results queue to drain 13355 1727096173.27587: waiting for pending results... 13355 1727096173.27986: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 13355 1727096173.27995: in run() - task 0afff68d-5257-c514-593f-00000000067a 13355 1727096173.28014: variable 'ansible_search_path' from source: unknown 13355 1727096173.28020: variable 'ansible_search_path' from source: unknown 13355 1727096173.28059: calling self._execute() 13355 1727096173.28165: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.28180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.28198: variable 'omit' from source: magic vars 13355 1727096173.28636: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.28658: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.28797: variable 'profile_stat' from source: set_fact 13355 1727096173.28818: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096173.28825: when evaluation is False, skipping this task 13355 1727096173.28832: _execute() done 13355 1727096173.28838: dumping result to json 13355 1727096173.28852: done dumping result, returning 13355 1727096173.28870: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0afff68d-5257-c514-593f-00000000067a] 13355 1727096173.28880: sending task result for task 0afff68d-5257-c514-593f-00000000067a skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096173.29123: no more pending results, returning what we have 13355 1727096173.29128: results queue empty 13355 1727096173.29129: checking for any_errors_fatal 13355 1727096173.29136: done checking for any_errors_fatal 13355 1727096173.29137: checking for 
max_fail_percentage 13355 1727096173.29139: done checking for max_fail_percentage 13355 1727096173.29140: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.29141: done checking to see if all hosts have failed 13355 1727096173.29141: getting the remaining hosts for this loop 13355 1727096173.29143: done getting the remaining hosts for this loop 13355 1727096173.29147: getting the next task for host managed_node3 13355 1727096173.29158: done getting next task for host managed_node3 13355 1727096173.29161: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13355 1727096173.29166: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096173.29173: getting variables 13355 1727096173.29175: in VariableManager get_vars() 13355 1727096173.29233: Calling all_inventory to load vars for managed_node3 13355 1727096173.29236: Calling groups_inventory to load vars for managed_node3 13355 1727096173.29238: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.29255: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.29258: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.29261: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.35475: done sending task result for task 0afff68d-5257-c514-593f-00000000067a 13355 1727096173.35480: WORKER PROCESS EXITING 13355 1727096173.36761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.38150: done with get_vars() 13355 1727096173.38177: done getting variables 13355 1727096173.38214: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096173.38289: variable 'profile' from source: include params 13355 1727096173.38292: variable 'item' from source: include params 13355 1727096173.38332: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Monday 23 September 2024 08:56:13 -0400 (0:00:00.111) 0:00:22.644 ****** 13355 1727096173.38355: entering _queue_task() for managed_node3/set_fact 13355 1727096173.38723: worker is 1 (out of 1 available) 13355 1727096173.38735: exiting _queue_task() for managed_node3/set_fact 13355 
1727096173.38749: done queuing things up, now waiting for results queue to drain 13355 1727096173.38750: waiting for pending results... 13355 1727096173.38971: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 13355 1727096173.39115: in run() - task 0afff68d-5257-c514-593f-00000000067b 13355 1727096173.39135: variable 'ansible_search_path' from source: unknown 13355 1727096173.39144: variable 'ansible_search_path' from source: unknown 13355 1727096173.39188: calling self._execute() 13355 1727096173.39297: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.39312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.39330: variable 'omit' from source: magic vars 13355 1727096173.39732: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.39763: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.39894: variable 'profile_stat' from source: set_fact 13355 1727096173.39913: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096173.39921: when evaluation is False, skipping this task 13355 1727096173.39928: _execute() done 13355 1727096173.39936: dumping result to json 13355 1727096173.39944: done dumping result, returning 13355 1727096173.39958: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0afff68d-5257-c514-593f-00000000067b] 13355 1727096173.39971: sending task result for task 0afff68d-5257-c514-593f-00000000067b skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096173.40133: no more pending results, returning what we have 13355 1727096173.40137: results queue empty 13355 1727096173.40138: checking for any_errors_fatal 13355 1727096173.40146: done checking for any_errors_fatal 13355 1727096173.40147: 
checking for max_fail_percentage 13355 1727096173.40149: done checking for max_fail_percentage 13355 1727096173.40150: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.40151: done checking to see if all hosts have failed 13355 1727096173.40151: getting the remaining hosts for this loop 13355 1727096173.40153: done getting the remaining hosts for this loop 13355 1727096173.40157: getting the next task for host managed_node3 13355 1727096173.40162: done getting next task for host managed_node3 13355 1727096173.40171: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13355 1727096173.40176: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096173.40181: getting variables 13355 1727096173.40183: in VariableManager get_vars() 13355 1727096173.40241: Calling all_inventory to load vars for managed_node3 13355 1727096173.40244: Calling groups_inventory to load vars for managed_node3 13355 1727096173.40247: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.40261: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.40265: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.40388: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.41060: done sending task result for task 0afff68d-5257-c514-593f-00000000067b 13355 1727096173.41063: WORKER PROCESS EXITING 13355 1727096173.42279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.44819: done with get_vars() 13355 1727096173.44847: done getting variables 13355 1727096173.45025: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096173.45262: variable 'profile' from source: include params 13355 1727096173.45266: variable 'item' from source: include params 13355 1727096173.45430: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Monday 23 September 2024 08:56:13 -0400 (0:00:00.071) 0:00:22.715 ****** 13355 1727096173.45466: entering _queue_task() for managed_node3/command 13355 1727096173.46216: worker is 1 (out of 1 available) 13355 1727096173.46232: exiting _queue_task() for managed_node3/command 13355 
1727096173.46276: done queuing things up, now waiting for results queue to drain 13355 1727096173.46278: waiting for pending results... 13355 1727096173.46543: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 13355 1727096173.46694: in run() - task 0afff68d-5257-c514-593f-00000000067c 13355 1727096173.46721: variable 'ansible_search_path' from source: unknown 13355 1727096173.46730: variable 'ansible_search_path' from source: unknown 13355 1727096173.46787: calling self._execute() 13355 1727096173.46872: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.46876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.47078: variable 'omit' from source: magic vars 13355 1727096173.47287: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.47301: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.47433: variable 'profile_stat' from source: set_fact 13355 1727096173.47447: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096173.47451: when evaluation is False, skipping this task 13355 1727096173.47457: _execute() done 13355 1727096173.47459: dumping result to json 13355 1727096173.47462: done dumping result, returning 13355 1727096173.47466: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0afff68d-5257-c514-593f-00000000067c] 13355 1727096173.47474: sending task result for task 0afff68d-5257-c514-593f-00000000067c 13355 1727096173.47570: done sending task result for task 0afff68d-5257-c514-593f-00000000067c 13355 1727096173.47574: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096173.47628: no more pending results, returning what we have 13355 1727096173.47632: results queue empty 13355 
1727096173.47885: checking for any_errors_fatal 13355 1727096173.47891: done checking for any_errors_fatal 13355 1727096173.47892: checking for max_fail_percentage 13355 1727096173.47894: done checking for max_fail_percentage 13355 1727096173.47895: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.47895: done checking to see if all hosts have failed 13355 1727096173.47896: getting the remaining hosts for this loop 13355 1727096173.47897: done getting the remaining hosts for this loop 13355 1727096173.47901: getting the next task for host managed_node3 13355 1727096173.47908: done getting next task for host managed_node3 13355 1727096173.47910: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13355 1727096173.47914: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096173.47918: getting variables 13355 1727096173.47919: in VariableManager get_vars() 13355 1727096173.47966: Calling all_inventory to load vars for managed_node3 13355 1727096173.47971: Calling groups_inventory to load vars for managed_node3 13355 1727096173.47973: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.47983: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.47986: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.47989: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.50342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.52059: done with get_vars() 13355 1727096173.52093: done getting variables 13355 1727096173.52160: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096173.52280: variable 'profile' from source: include params 13355 1727096173.52284: variable 'item' from source: include params 13355 1727096173.52343: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Monday 23 September 2024 08:56:13 -0400 (0:00:00.069) 0:00:22.784 ****** 13355 1727096173.52380: entering _queue_task() for managed_node3/set_fact 13355 1727096173.52758: worker is 1 (out of 1 available) 13355 1727096173.52975: exiting _queue_task() for managed_node3/set_fact 13355 1727096173.52988: done queuing things up, now waiting for results queue to drain 13355 1727096173.52989: waiting for pending results... 
13355 1727096173.53230: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 13355 1727096173.53295: in run() - task 0afff68d-5257-c514-593f-00000000067d 13355 1727096173.53316: variable 'ansible_search_path' from source: unknown 13355 1727096173.53328: variable 'ansible_search_path' from source: unknown 13355 1727096173.53377: calling self._execute() 13355 1727096173.53761: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.53765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.53771: variable 'omit' from source: magic vars 13355 1727096173.54441: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.54462: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.54772: variable 'profile_stat' from source: set_fact 13355 1727096173.54776: Evaluated conditional (profile_stat.stat.exists): False 13355 1727096173.54778: when evaluation is False, skipping this task 13355 1727096173.54781: _execute() done 13355 1727096173.54783: dumping result to json 13355 1727096173.54785: done dumping result, returning 13355 1727096173.54788: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0afff68d-5257-c514-593f-00000000067d] 13355 1727096173.54790: sending task result for task 0afff68d-5257-c514-593f-00000000067d 13355 1727096173.55047: done sending task result for task 0afff68d-5257-c514-593f-00000000067d 13355 1727096173.55050: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13355 1727096173.55105: no more pending results, returning what we have 13355 1727096173.55109: results queue empty 13355 1727096173.55110: checking for any_errors_fatal 13355 1727096173.55116: done checking for any_errors_fatal 13355 1727096173.55117: checking 
for max_fail_percentage 13355 1727096173.55119: done checking for max_fail_percentage 13355 1727096173.55119: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.55120: done checking to see if all hosts have failed 13355 1727096173.55121: getting the remaining hosts for this loop 13355 1727096173.55122: done getting the remaining hosts for this loop 13355 1727096173.55126: getting the next task for host managed_node3 13355 1727096173.55134: done getting next task for host managed_node3 13355 1727096173.55137: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13355 1727096173.55140: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096173.55146: getting variables 13355 1727096173.55147: in VariableManager get_vars() 13355 1727096173.55209: Calling all_inventory to load vars for managed_node3 13355 1727096173.55212: Calling groups_inventory to load vars for managed_node3 13355 1727096173.55215: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.55229: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.55232: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.55235: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.56961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.58525: done with get_vars() 13355 1727096173.58561: done getting variables 13355 1727096173.58630: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096173.58758: variable 'profile' from source: include params 13355 1727096173.58762: variable 'item' from source: include params 13355 1727096173.58823: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Monday 23 September 2024 08:56:13 -0400 (0:00:00.064) 0:00:22.849 ****** 13355 1727096173.58859: entering _queue_task() for managed_node3/assert 13355 1727096173.59245: worker is 1 (out of 1 available) 13355 1727096173.59260: exiting _queue_task() for managed_node3/assert 13355 1727096173.59473: done queuing things up, now waiting for results queue to drain 13355 1727096173.59475: waiting for pending results... 
13355 1727096173.59560: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 13355 1727096173.59692: in run() - task 0afff68d-5257-c514-593f-000000000364 13355 1727096173.59717: variable 'ansible_search_path' from source: unknown 13355 1727096173.59723: variable 'ansible_search_path' from source: unknown 13355 1727096173.59770: calling self._execute() 13355 1727096173.59878: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.59890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.59905: variable 'omit' from source: magic vars 13355 1727096173.60305: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.60354: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.60359: variable 'omit' from source: magic vars 13355 1727096173.60390: variable 'omit' from source: magic vars 13355 1727096173.60499: variable 'profile' from source: include params 13355 1727096173.60509: variable 'item' from source: include params 13355 1727096173.60675: variable 'item' from source: include params 13355 1727096173.60680: variable 'omit' from source: magic vars 13355 1727096173.60682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096173.60694: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096173.60718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096173.60740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.60760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.60803: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13355 1727096173.60811: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.60819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.60929: Set connection var ansible_shell_executable to /bin/sh 13355 1727096173.60940: Set connection var ansible_shell_type to sh 13355 1727096173.60949: Set connection var ansible_pipelining to False 13355 1727096173.60961: Set connection var ansible_connection to ssh 13355 1727096173.60973: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096173.60982: Set connection var ansible_timeout to 10 13355 1727096173.61015: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.61022: variable 'ansible_connection' from source: unknown 13355 1727096173.61029: variable 'ansible_module_compression' from source: unknown 13355 1727096173.61036: variable 'ansible_shell_type' from source: unknown 13355 1727096173.61042: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.61048: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.61108: variable 'ansible_pipelining' from source: unknown 13355 1727096173.61112: variable 'ansible_timeout' from source: unknown 13355 1727096173.61115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.61314: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096173.61337: variable 'omit' from source: magic vars 13355 1727096173.61348: starting attempt loop 13355 1727096173.61357: running the handler 13355 1727096173.61485: variable 'lsr_net_profile_exists' from source: set_fact 13355 1727096173.61495: Evaluated conditional 
(lsr_net_profile_exists): True 13355 1727096173.61505: handler run complete 13355 1727096173.61522: attempt loop complete, returning result 13355 1727096173.61528: _execute() done 13355 1727096173.61541: dumping result to json 13355 1727096173.61655: done dumping result, returning 13355 1727096173.61659: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [0afff68d-5257-c514-593f-000000000364] 13355 1727096173.61661: sending task result for task 0afff68d-5257-c514-593f-000000000364 13355 1727096173.61733: done sending task result for task 0afff68d-5257-c514-593f-000000000364 13355 1727096173.61740: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096173.61807: no more pending results, returning what we have 13355 1727096173.61811: results queue empty 13355 1727096173.61812: checking for any_errors_fatal 13355 1727096173.61821: done checking for any_errors_fatal 13355 1727096173.61822: checking for max_fail_percentage 13355 1727096173.61824: done checking for max_fail_percentage 13355 1727096173.61825: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.61826: done checking to see if all hosts have failed 13355 1727096173.61826: getting the remaining hosts for this loop 13355 1727096173.61828: done getting the remaining hosts for this loop 13355 1727096173.61833: getting the next task for host managed_node3 13355 1727096173.61842: done getting next task for host managed_node3 13355 1727096173.61845: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13355 1727096173.61848: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096173.61856: getting variables 13355 1727096173.61857: in VariableManager get_vars() 13355 1727096173.61924: Calling all_inventory to load vars for managed_node3 13355 1727096173.61926: Calling groups_inventory to load vars for managed_node3 13355 1727096173.61929: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.61941: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.61944: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.61947: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.63603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.66443: done with get_vars() 13355 1727096173.66479: done getting variables 13355 1727096173.66533: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096173.66862: variable 'profile' from source: include params 13355 1727096173.66867: variable 'item' from source: include params 13355 1727096173.67148: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Monday 23 September 2024 08:56:13 -0400 
(0:00:00.083) 0:00:22.932 ****** 13355 1727096173.67193: entering _queue_task() for managed_node3/assert 13355 1727096173.68080: worker is 1 (out of 1 available) 13355 1727096173.68094: exiting _queue_task() for managed_node3/assert 13355 1727096173.68107: done queuing things up, now waiting for results queue to drain 13355 1727096173.68108: waiting for pending results... 13355 1727096173.68603: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 13355 1727096173.68970: in run() - task 0afff68d-5257-c514-593f-000000000365 13355 1727096173.68975: variable 'ansible_search_path' from source: unknown 13355 1727096173.68978: variable 'ansible_search_path' from source: unknown 13355 1727096173.68981: calling self._execute() 13355 1727096173.69107: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.69111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.69121: variable 'omit' from source: magic vars 13355 1727096173.69609: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.69627: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.69642: variable 'omit' from source: magic vars 13355 1727096173.69694: variable 'omit' from source: magic vars 13355 1727096173.69811: variable 'profile' from source: include params 13355 1727096173.69821: variable 'item' from source: include params 13355 1727096173.69897: variable 'item' from source: include params 13355 1727096173.69922: variable 'omit' from source: magic vars 13355 1727096173.69976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096173.70016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096173.70040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 
1727096173.70174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.70178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.70180: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096173.70183: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.70185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.70238: Set connection var ansible_shell_executable to /bin/sh 13355 1727096173.70249: Set connection var ansible_shell_type to sh 13355 1727096173.70261: Set connection var ansible_pipelining to False 13355 1727096173.70273: Set connection var ansible_connection to ssh 13355 1727096173.70289: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096173.70299: Set connection var ansible_timeout to 10 13355 1727096173.70327: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.70334: variable 'ansible_connection' from source: unknown 13355 1727096173.70340: variable 'ansible_module_compression' from source: unknown 13355 1727096173.70346: variable 'ansible_shell_type' from source: unknown 13355 1727096173.70352: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.70361: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.70370: variable 'ansible_pipelining' from source: unknown 13355 1727096173.70377: variable 'ansible_timeout' from source: unknown 13355 1727096173.70387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.70534: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096173.70551: variable 'omit' from source: magic vars 13355 1727096173.70565: starting attempt loop 13355 1727096173.70575: running the handler 13355 1727096173.70717: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13355 1727096173.70721: Evaluated conditional (lsr_net_profile_ansible_managed): True 13355 1727096173.70726: handler run complete 13355 1727096173.70745: attempt loop complete, returning result 13355 1727096173.70826: _execute() done 13355 1727096173.70829: dumping result to json 13355 1727096173.70832: done dumping result, returning 13355 1727096173.70835: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0afff68d-5257-c514-593f-000000000365] 13355 1727096173.70837: sending task result for task 0afff68d-5257-c514-593f-000000000365 13355 1727096173.70914: done sending task result for task 0afff68d-5257-c514-593f-000000000365 13355 1727096173.70917: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096173.70985: no more pending results, returning what we have 13355 1727096173.70990: results queue empty 13355 1727096173.70991: checking for any_errors_fatal 13355 1727096173.70999: done checking for any_errors_fatal 13355 1727096173.71000: checking for max_fail_percentage 13355 1727096173.71001: done checking for max_fail_percentage 13355 1727096173.71002: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.71003: done checking to see if all hosts have failed 13355 1727096173.71003: getting the remaining hosts for this loop 13355 1727096173.71005: done getting the remaining hosts for this loop 13355 1727096173.71008: getting the next task for host managed_node3 13355 1727096173.71014: done getting 
next task for host managed_node3 13355 1727096173.71016: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13355 1727096173.71019: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096173.71023: getting variables 13355 1727096173.71024: in VariableManager get_vars() 13355 1727096173.71084: Calling all_inventory to load vars for managed_node3 13355 1727096173.71086: Calling groups_inventory to load vars for managed_node3 13355 1727096173.71089: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.71100: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.71103: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.71105: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.72971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.74945: done with get_vars() 13355 1727096173.74980: done getting variables 13355 1727096173.75038: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096173.75157: variable 'profile' from source: include params 13355 1727096173.75161: variable 'item' from 
source: include params 13355 1727096173.75217: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Monday 23 September 2024 08:56:13 -0400 (0:00:00.080) 0:00:23.013 ****** 13355 1727096173.75251: entering _queue_task() for managed_node3/assert 13355 1727096173.75621: worker is 1 (out of 1 available) 13355 1727096173.75634: exiting _queue_task() for managed_node3/assert 13355 1727096173.75647: done queuing things up, now waiting for results queue to drain 13355 1727096173.75648: waiting for pending results... 13355 1727096173.75990: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 13355 1727096173.76077: in run() - task 0afff68d-5257-c514-593f-000000000366 13355 1727096173.76098: variable 'ansible_search_path' from source: unknown 13355 1727096173.76106: variable 'ansible_search_path' from source: unknown 13355 1727096173.76150: calling self._execute() 13355 1727096173.76264: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.76279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.76297: variable 'omit' from source: magic vars 13355 1727096173.76700: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.76757: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.76772: variable 'omit' from source: magic vars 13355 1727096173.76821: variable 'omit' from source: magic vars 13355 1727096173.77172: variable 'profile' from source: include params 13355 1727096173.77176: variable 'item' from source: include params 13355 1727096173.77178: variable 'item' from source: include params 13355 1727096173.77181: variable 'omit' from source: magic vars 13355 1727096173.77183: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096173.77186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096173.77188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096173.77300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.77416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.77419: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096173.77421: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.77423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.77483: Set connection var ansible_shell_executable to /bin/sh 13355 1727096173.77495: Set connection var ansible_shell_type to sh 13355 1727096173.77506: Set connection var ansible_pipelining to False 13355 1727096173.77515: Set connection var ansible_connection to ssh 13355 1727096173.77530: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096173.77541: Set connection var ansible_timeout to 10 13355 1727096173.77575: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.77582: variable 'ansible_connection' from source: unknown 13355 1727096173.77588: variable 'ansible_module_compression' from source: unknown 13355 1727096173.77593: variable 'ansible_shell_type' from source: unknown 13355 1727096173.77598: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.77603: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.77608: variable 'ansible_pipelining' from source: unknown 13355 1727096173.77613: variable 'ansible_timeout' 
from source: unknown 13355 1727096173.77619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.77760: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096173.77780: variable 'omit' from source: magic vars 13355 1727096173.77791: starting attempt loop 13355 1727096173.77799: running the handler 13355 1727096173.77919: variable 'lsr_net_profile_fingerprint' from source: set_fact 13355 1727096173.77930: Evaluated conditional (lsr_net_profile_fingerprint): True 13355 1727096173.77960: handler run complete 13355 1727096173.77965: attempt loop complete, returning result 13355 1727096173.77975: _execute() done 13355 1727096173.77983: dumping result to json 13355 1727096173.78070: done dumping result, returning 13355 1727096173.78073: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [0afff68d-5257-c514-593f-000000000366] 13355 1727096173.78075: sending task result for task 0afff68d-5257-c514-593f-000000000366 13355 1727096173.78143: done sending task result for task 0afff68d-5257-c514-593f-000000000366 13355 1727096173.78146: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096173.78217: no more pending results, returning what we have 13355 1727096173.78221: results queue empty 13355 1727096173.78222: checking for any_errors_fatal 13355 1727096173.78230: done checking for any_errors_fatal 13355 1727096173.78231: checking for max_fail_percentage 13355 1727096173.78233: done checking for max_fail_percentage 13355 1727096173.78234: checking to see if all hosts have failed and the running result is not ok 13355 1727096173.78234: done checking to see if all 
hosts have failed 13355 1727096173.78235: getting the remaining hosts for this loop 13355 1727096173.78237: done getting the remaining hosts for this loop 13355 1727096173.78240: getting the next task for host managed_node3 13355 1727096173.78250: done getting next task for host managed_node3 13355 1727096173.78256: ^ task is: TASK: ** TEST check polling interval 13355 1727096173.78259: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096173.78264: getting variables 13355 1727096173.78265: in VariableManager get_vars() 13355 1727096173.78325: Calling all_inventory to load vars for managed_node3 13355 1727096173.78328: Calling groups_inventory to load vars for managed_node3 13355 1727096173.78331: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096173.78343: Calling all_plugins_play to load vars for managed_node3 13355 1727096173.78346: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096173.78349: Calling groups_plugins_play to load vars for managed_node3 13355 1727096173.79937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096173.81641: done with get_vars() 13355 1727096173.81667: done getting variables 13355 1727096173.81727: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:75 Monday 23 September 2024 08:56:13 -0400 (0:00:00.065) 0:00:23.078 ****** 13355 1727096173.81761: entering _queue_task() for managed_node3/command 13355 1727096173.82295: worker is 1 (out of 1 available) 13355 1727096173.82306: exiting _queue_task() for managed_node3/command 13355 1727096173.82317: done queuing things up, now waiting for results queue to drain 13355 1727096173.82319: waiting for pending results... 13355 1727096173.82448: running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval 13355 1727096173.82545: in run() - task 0afff68d-5257-c514-593f-000000000071 13355 1727096173.82549: variable 'ansible_search_path' from source: unknown 13355 1727096173.82586: calling self._execute() 13355 1727096173.82697: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.82761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.82765: variable 'omit' from source: magic vars 13355 1727096173.83116: variable 'ansible_distribution_major_version' from source: facts 13355 1727096173.83133: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096173.83142: variable 'omit' from source: magic vars 13355 1727096173.83169: variable 'omit' from source: magic vars 13355 1727096173.83273: variable 'controller_device' from source: play vars 13355 1727096173.83299: variable 'omit' from source: magic vars 13355 1727096173.83345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096173.83390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096173.83419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096173.83473: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.83477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096173.83498: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096173.83505: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.83516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.83670: Set connection var ansible_shell_executable to /bin/sh 13355 1727096173.83677: Set connection var ansible_shell_type to sh 13355 1727096173.83681: Set connection var ansible_pipelining to False 13355 1727096173.83683: Set connection var ansible_connection to ssh 13355 1727096173.83685: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096173.83687: Set connection var ansible_timeout to 10 13355 1727096173.83693: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.83700: variable 'ansible_connection' from source: unknown 13355 1727096173.83707: variable 'ansible_module_compression' from source: unknown 13355 1727096173.83713: variable 'ansible_shell_type' from source: unknown 13355 1727096173.83719: variable 'ansible_shell_executable' from source: unknown 13355 1727096173.83725: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096173.83732: variable 'ansible_pipelining' from source: unknown 13355 1727096173.83738: variable 'ansible_timeout' from source: unknown 13355 1727096173.83746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096173.83900: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096173.84003: variable 'omit' from source: magic vars 13355 1727096173.84006: starting attempt loop 13355 1727096173.84009: running the handler 13355 1727096173.84011: _low_level_execute_command(): starting 13355 1727096173.84013: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096173.84713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096173.84725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096173.84780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096173.84848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096173.84882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096173.84904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096173.85181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 13355 1727096173.86798: stdout chunk (state=3): >>>/root <<< 13355 1727096173.87059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096173.87063: stdout chunk (state=3): >>><<< 13355 1727096173.87065: stderr chunk (state=3): >>><<< 13355 1727096173.87092: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096173.87114: _low_level_execute_command(): starting 13355 1727096173.87361: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674 `" && echo ansible-tmp-1727096173.8709896-14416-78325922087674="` echo /root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674 `" ) && sleep 0' 13355 1727096173.88520: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096173.88887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096173.88919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096173.88993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096173.91113: stdout chunk (state=3): >>>ansible-tmp-1727096173.8709896-14416-78325922087674=/root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674 <<< 13355 1727096173.91192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096173.91204: stdout chunk (state=3): >>><<< 13355 1727096173.91217: stderr chunk (state=3): >>><<< 13355 1727096173.91241: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096173.8709896-14416-78325922087674=/root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096173.91290: variable 'ansible_module_compression' from source: unknown 13355 1727096173.91346: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096173.91447: variable 'ansible_facts' from source: unknown 13355 1727096173.91656: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/AnsiballZ_command.py 13355 1727096173.92015: Sending initial data 13355 1727096173.92043: Sent initial data (155 bytes) 13355 1727096173.93540: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096173.93636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096173.95334: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13355 1727096173.95349: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13355 1727096173.95364: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13355 1727096173.95405: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096173.95485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096173.95548: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmptrs7d1ig /root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/AnsiballZ_command.py <<< 13355 1727096173.95563: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/AnsiballZ_command.py" <<< 13355 1727096173.95590: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmptrs7d1ig" to remote "/root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/AnsiballZ_command.py" <<< 13355 1727096173.96865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096173.96923: stderr chunk (state=3): >>><<< 13355 1727096173.97126: stdout chunk (state=3): >>><<< 13355 1727096173.97129: done transferring module to remote 13355 1727096173.97131: _low_level_execute_command(): starting 13355 1727096173.97134: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/ /root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/AnsiballZ_command.py && sleep 0' 13355 1727096173.98249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096173.98256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096173.98259: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096173.98375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096173.98474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096173.98540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.00775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.00780: stdout chunk (state=3): >>><<< 13355 1727096174.00782: stderr chunk (state=3): >>><<< 13355 1727096174.00785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.00792: _low_level_execute_command(): starting 13355 1727096174.00795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/AnsiballZ_command.py && sleep 0' 13355 1727096174.01715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.01734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096174.01784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096174.01794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.01841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096174.01964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.01979: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.02084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.18434: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-23 08:56:14.177840", "end": "2024-09-23 08:56:14.181418", "delta": "0:00:00.003578", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096174.20222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096174.20227: stdout chunk (state=3): >>><<< 13355 1727096174.20233: stderr chunk (state=3): >>><<< 13355 1727096174.20258: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-23 08:56:14.177840", "end": "2024-09-23 08:56:14.181418", "delta": "0:00:00.003578", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096174.20307: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096174.20315: _low_level_execute_command(): starting 13355 1727096174.20320: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096173.8709896-14416-78325922087674/ > /dev/null 2>&1 && 
sleep 0' 13355 1727096174.20974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096174.20992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.21074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.21078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096174.21080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096174.21083: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096174.21085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.21087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096174.21089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096174.21091: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096174.21146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.21176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096174.21188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.21204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.21269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.23352: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.23359: stdout chunk (state=3): >>><<< 13355 1727096174.23362: stderr chunk (state=3): >>><<< 13355 1727096174.23385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.23394: handler run complete 13355 1727096174.23416: Evaluated conditional (False): False 13355 1727096174.23671: variable 'result' from source: unknown 13355 1727096174.23690: Evaluated conditional ('110' in result.stdout): True 13355 1727096174.23706: attempt loop complete, returning result 13355 1727096174.23709: _execute() done 13355 1727096174.23711: dumping result to json 13355 1727096174.23713: done dumping result, returning 13355 1727096174.23737: done running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval 
[0afff68d-5257-c514-593f-000000000071] 13355 1727096174.23740: sending task result for task 0afff68d-5257-c514-593f-000000000071 13355 1727096174.23889: done sending task result for task 0afff68d-5257-c514-593f-000000000071 13355 1727096174.23891: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003578", "end": "2024-09-23 08:56:14.181418", "rc": 0, "start": "2024-09-23 08:56:14.177840" } STDOUT: MII Polling Interval (ms): 110 13355 1727096174.23982: no more pending results, returning what we have 13355 1727096174.23986: results queue empty 13355 1727096174.23986: checking for any_errors_fatal 13355 1727096174.23995: done checking for any_errors_fatal 13355 1727096174.23995: checking for max_fail_percentage 13355 1727096174.23997: done checking for max_fail_percentage 13355 1727096174.23998: checking to see if all hosts have failed and the running result is not ok 13355 1727096174.23998: done checking to see if all hosts have failed 13355 1727096174.23999: getting the remaining hosts for this loop 13355 1727096174.24001: done getting the remaining hosts for this loop 13355 1727096174.24005: getting the next task for host managed_node3 13355 1727096174.24010: done getting next task for host managed_node3 13355 1727096174.24013: ^ task is: TASK: ** TEST check IPv4 13355 1727096174.24015: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096174.24018: getting variables 13355 1727096174.24019: in VariableManager get_vars() 13355 1727096174.24235: Calling all_inventory to load vars for managed_node3 13355 1727096174.24239: Calling groups_inventory to load vars for managed_node3 13355 1727096174.24242: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096174.24252: Calling all_plugins_play to load vars for managed_node3 13355 1727096174.24256: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096174.24259: Calling groups_plugins_play to load vars for managed_node3 13355 1727096174.26069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096174.29138: done with get_vars() 13355 1727096174.29174: done getting variables 13355 1727096174.29263: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:80 Monday 23 September 2024 08:56:14 -0400 (0:00:00.475) 0:00:23.553 ****** 13355 1727096174.29295: entering _queue_task() for managed_node3/command 13355 1727096174.29693: worker is 1 (out of 1 available) 13355 1727096174.29706: exiting _queue_task() for managed_node3/command 13355 1727096174.29718: done queuing things up, now waiting for results queue to drain 13355 1727096174.29719: waiting for pending results... 
13355 1727096174.30186: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 13355 1727096174.30192: in run() - task 0afff68d-5257-c514-593f-000000000072 13355 1727096174.30195: variable 'ansible_search_path' from source: unknown 13355 1727096174.30198: calling self._execute() 13355 1727096174.30598: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096174.30602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096174.30610: variable 'omit' from source: magic vars 13355 1727096174.31170: variable 'ansible_distribution_major_version' from source: facts 13355 1727096174.31174: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096174.31177: variable 'omit' from source: magic vars 13355 1727096174.31179: variable 'omit' from source: magic vars 13355 1727096174.31294: variable 'controller_device' from source: play vars 13355 1727096174.31313: variable 'omit' from source: magic vars 13355 1727096174.31473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096174.31512: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096174.31532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096174.31664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096174.31680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096174.31712: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096174.31715: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096174.31718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096174.31935: Set 
connection var ansible_shell_executable to /bin/sh 13355 1727096174.31941: Set connection var ansible_shell_type to sh 13355 1727096174.31947: Set connection var ansible_pipelining to False 13355 1727096174.31952: Set connection var ansible_connection to ssh 13355 1727096174.31957: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096174.31963: Set connection var ansible_timeout to 10 13355 1727096174.32117: variable 'ansible_shell_executable' from source: unknown 13355 1727096174.32121: variable 'ansible_connection' from source: unknown 13355 1727096174.32124: variable 'ansible_module_compression' from source: unknown 13355 1727096174.32127: variable 'ansible_shell_type' from source: unknown 13355 1727096174.32130: variable 'ansible_shell_executable' from source: unknown 13355 1727096174.32132: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096174.32134: variable 'ansible_pipelining' from source: unknown 13355 1727096174.32136: variable 'ansible_timeout' from source: unknown 13355 1727096174.32173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096174.32465: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096174.32476: variable 'omit' from source: magic vars 13355 1727096174.32481: starting attempt loop 13355 1727096174.32485: running the handler 13355 1727096174.32502: _low_level_execute_command(): starting 13355 1727096174.32510: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096174.33392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.33447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096174.33489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.33520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.35475: stdout chunk (state=3): >>>/root <<< 13355 1727096174.35560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.35564: stdout chunk (state=3): >>><<< 13355 1727096174.35566: stderr chunk (state=3): >>><<< 13355 1727096174.35591: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.35689: _low_level_execute_command(): starting 13355 1727096174.35693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407 `" && echo ansible-tmp-1727096174.355977-14436-255079085085407="` echo /root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407 `" ) && sleep 0' 13355 1727096174.36228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096174.36243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.36256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.36277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096174.36384: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.36408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.36474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.38481: stdout chunk (state=3): >>>ansible-tmp-1727096174.355977-14436-255079085085407=/root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407 <<< 13355 1727096174.38644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.38648: stdout chunk (state=3): >>><<< 13355 1727096174.38651: stderr chunk (state=3): >>><<< 13355 1727096174.38674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096174.355977-14436-255079085085407=/root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.38873: variable 'ansible_module_compression' from source: unknown 13355 1727096174.38877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096174.38879: variable 'ansible_facts' from source: unknown 13355 1727096174.38904: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/AnsiballZ_command.py 13355 1727096174.39091: Sending initial data 13355 1727096174.39106: Sent initial data (155 bytes) 13355 1727096174.39684: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096174.39699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.39714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.39731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096174.39765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.39783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.39881: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.39909: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.39973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.41665: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096174.41751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096174.41754: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpt8h4fhr7 /root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/AnsiballZ_command.py <<< 13355 1727096174.41802: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/AnsiballZ_command.py" <<< 13355 1727096174.41806: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpt8h4fhr7" to remote "/root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/AnsiballZ_command.py" <<< 13355 1727096174.42456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.42588: stderr chunk (state=3): >>><<< 13355 1727096174.42591: stdout chunk (state=3): >>><<< 13355 1727096174.42593: done transferring module to remote 13355 1727096174.42595: _low_level_execute_command(): starting 13355 1727096174.42597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/ /root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/AnsiballZ_command.py && sleep 0' 13355 1727096174.43137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096174.43152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.43168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.43186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096174.43223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.43306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.43357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.43398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.45312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.45321: stderr chunk (state=3): >>><<< 13355 1727096174.45324: stdout chunk (state=3): >>><<< 13355 1727096174.45425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.45429: _low_level_execute_command(): starting 13355 1727096174.45433: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/AnsiballZ_command.py && sleep 0' 13355 1727096174.45992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096174.46009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.46026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.46045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096174.46082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.46100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096174.46180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096174.46200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.46217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.46308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.62602: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.223/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-23 08:56:14.618848", "end": "2024-09-23 08:56:14.622674", "delta": "0:00:00.003826", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096174.64178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096174.64202: stderr chunk (state=3): >>><<< 13355 1727096174.64208: stdout chunk (state=3): >>><<< 13355 1727096174.64225: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.223/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-23 08:56:14.618848", "end": "2024-09-23 08:56:14.622674", "delta": "0:00:00.003826", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096174.64258: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096174.64271: _low_level_execute_command(): starting 13355 1727096174.64278: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096174.355977-14436-255079085085407/ > /dev/null 2>&1 && sleep 0' 13355 1727096174.65151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096174.65186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.65189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.65413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.67146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.67180: stderr chunk (state=3): >>><<< 13355 1727096174.67184: stdout chunk (state=3): >>><<< 13355 1727096174.67213: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.67217: handler run complete 13355 
1727096174.67227: Evaluated conditional (False): False 13355 1727096174.67341: variable 'result' from source: set_fact 13355 1727096174.67354: Evaluated conditional ('192.0.2' in result.stdout): True 13355 1727096174.67366: attempt loop complete, returning result 13355 1727096174.67371: _execute() done 13355 1727096174.67374: dumping result to json 13355 1727096174.67378: done dumping result, returning 13355 1727096174.67386: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [0afff68d-5257-c514-593f-000000000072] 13355 1727096174.67391: sending task result for task 0afff68d-5257-c514-593f-000000000072 13355 1727096174.67485: done sending task result for task 0afff68d-5257-c514-593f-000000000072 13355 1727096174.67488: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003826", "end": "2024-09-23 08:56:14.622674", "rc": 0, "start": "2024-09-23 08:56:14.618848" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.223/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 236sec preferred_lft 236sec 13355 1727096174.67561: no more pending results, returning what we have 13355 1727096174.67565: results queue empty 13355 1727096174.67565: checking for any_errors_fatal 13355 1727096174.67576: done checking for any_errors_fatal 13355 1727096174.67577: checking for max_fail_percentage 13355 1727096174.67579: done checking for max_fail_percentage 13355 1727096174.67579: checking to see if all hosts have failed and the running result is not ok 13355 1727096174.67580: done checking to see if all hosts have failed 13355 1727096174.67581: getting the remaining hosts for this loop 13355 1727096174.67582: done getting the remaining hosts for this loop 13355 1727096174.67585: getting the next task for host managed_node3 13355 1727096174.67591: done getting next task for host managed_node3 13355 1727096174.67593: ^ 
task is: TASK: ** TEST check IPv6 13355 1727096174.67596: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096174.67606: getting variables 13355 1727096174.67607: in VariableManager get_vars() 13355 1727096174.67657: Calling all_inventory to load vars for managed_node3 13355 1727096174.67660: Calling groups_inventory to load vars for managed_node3 13355 1727096174.67662: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096174.67673: Calling all_plugins_play to load vars for managed_node3 13355 1727096174.67676: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096174.67679: Calling groups_plugins_play to load vars for managed_node3 13355 1727096174.70507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096174.73481: done with get_vars() 13355 1727096174.73516: done getting variables 13355 1727096174.73591: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:87 Monday 23 September 2024 08:56:14 -0400 (0:00:00.443) 0:00:23.996 ****** 13355 1727096174.73622: entering _queue_task() for managed_node3/command 13355 1727096174.74136: worker is 1 (out of 1 available) 13355 1727096174.74148: exiting _queue_task() for managed_node3/command 13355 
1727096174.74161: done queuing things up, now waiting for results queue to drain 13355 1727096174.74162: waiting for pending results... 13355 1727096174.74474: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 13355 1727096174.74481: in run() - task 0afff68d-5257-c514-593f-000000000073 13355 1727096174.74571: variable 'ansible_search_path' from source: unknown 13355 1727096174.74576: calling self._execute() 13355 1727096174.74643: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096174.74654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096174.74673: variable 'omit' from source: magic vars 13355 1727096174.75101: variable 'ansible_distribution_major_version' from source: facts 13355 1727096174.75127: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096174.75141: variable 'omit' from source: magic vars 13355 1727096174.75171: variable 'omit' from source: magic vars 13355 1727096174.75287: variable 'controller_device' from source: play vars 13355 1727096174.75310: variable 'omit' from source: magic vars 13355 1727096174.75440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096174.75445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096174.75447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096174.75471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096174.75489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096174.75523: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096174.75530: variable 'ansible_host' from source: host vars for 
'managed_node3' 13355 1727096174.75537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096174.75657: Set connection var ansible_shell_executable to /bin/sh 13355 1727096174.75672: Set connection var ansible_shell_type to sh 13355 1727096174.75682: Set connection var ansible_pipelining to False 13355 1727096174.75690: Set connection var ansible_connection to ssh 13355 1727096174.75699: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096174.75708: Set connection var ansible_timeout to 10 13355 1727096174.75737: variable 'ansible_shell_executable' from source: unknown 13355 1727096174.75746: variable 'ansible_connection' from source: unknown 13355 1727096174.75765: variable 'ansible_module_compression' from source: unknown 13355 1727096174.75770: variable 'ansible_shell_type' from source: unknown 13355 1727096174.75772: variable 'ansible_shell_executable' from source: unknown 13355 1727096174.75876: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096174.75879: variable 'ansible_pipelining' from source: unknown 13355 1727096174.75881: variable 'ansible_timeout' from source: unknown 13355 1727096174.75883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096174.76108: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096174.76127: variable 'omit' from source: magic vars 13355 1727096174.76138: starting attempt loop 13355 1727096174.76144: running the handler 13355 1727096174.76398: _low_level_execute_command(): starting 13355 1727096174.76402: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096174.77201: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.77300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.77395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.77464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.79181: stdout chunk (state=3): >>>/root <<< 13355 1727096174.79476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.79480: stdout chunk (state=3): >>><<< 13355 1727096174.79483: stderr chunk (state=3): >>><<< 13355 1727096174.79487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.79489: _low_level_execute_command(): starting 13355 1727096174.79492: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868 `" && echo ansible-tmp-1727096174.7935524-14462-240316431657868="` echo /root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868 `" ) && sleep 0' 13355 1727096174.79993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.80086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.80101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096174.80112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.80131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.80203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.82246: stdout chunk (state=3): >>>ansible-tmp-1727096174.7935524-14462-240316431657868=/root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868 <<< 13355 1727096174.82419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.82423: stdout chunk (state=3): >>><<< 13355 1727096174.82426: stderr chunk (state=3): >>><<< 13355 1727096174.82473: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096174.7935524-14462-240316431657868=/root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.82491: variable 'ansible_module_compression' from source: unknown 13355 1727096174.82560: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096174.82605: variable 'ansible_facts' from source: unknown 13355 1727096174.82711: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/AnsiballZ_command.py 13355 1727096174.82990: Sending initial data 13355 1727096174.82994: Sent initial data (156 bytes) 13355 1727096174.83643: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.83697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.83780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.85464: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096174.85539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096174.85576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpfwcpm1t9 /root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/AnsiballZ_command.py <<< 13355 1727096174.85603: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/AnsiballZ_command.py" <<< 13355 1727096174.85636: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13355 1727096174.85675: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpfwcpm1t9" to remote "/root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/AnsiballZ_command.py" <<< 13355 1727096174.86469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.86627: stdout chunk (state=3): >>><<< 13355 1727096174.86632: stderr chunk (state=3): >>><<< 13355 1727096174.86635: done transferring module to remote 13355 1727096174.86637: _low_level_execute_command(): starting 13355 1727096174.86640: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/ /root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/AnsiballZ_command.py && sleep 0' 13355 1727096174.87319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.87380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096174.87400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.87435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.87523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096174.89479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096174.89501: stdout chunk (state=3): >>><<< 13355 1727096174.89514: stderr chunk (state=3): >>><<< 13355 1727096174.89574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096174.89583: _low_level_execute_command(): starting 13355 1727096174.89586: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/AnsiballZ_command.py && sleep 0' 13355 1727096174.90235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096174.90255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096174.90272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096174.90291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096174.90309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096174.90336: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.90384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096174.90455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096174.90477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096174.90510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096174.90590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096175.06955: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::f5/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::f871:61ff:fe85:c3c6/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::f871:61ff:fe85:c3c6/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-23 08:56:15.062798", "end": "2024-09-23 08:56:15.066585", "delta": "0:00:00.003787", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096175.08624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096175.08650: stderr chunk (state=3): >>><<< 13355 1727096175.08656: stdout chunk (state=3): >>><<< 13355 1727096175.08673: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::f5/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::f871:61ff:fe85:c3c6/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::f871:61ff:fe85:c3c6/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-23 08:56:15.062798", "end": "2024-09-23 08:56:15.066585", "delta": "0:00:00.003787", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096175.08706: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096175.08713: _low_level_execute_command(): starting 13355 1727096175.08718: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096174.7935524-14462-240316431657868/ > /dev/null 2>&1 && sleep 0' 13355 1727096175.09177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096175.09181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096175.09184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096175.09186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096175.09239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096175.09242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096175.09250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096175.09284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096175.11171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096175.11201: stderr chunk (state=3): >>><<< 13355 1727096175.11204: stdout chunk (state=3): >>><<< 13355 1727096175.11218: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096175.11224: handler run complete 13355 1727096175.11243: Evaluated conditional (False): False 13355 1727096175.11361: variable 'result' from source: set_fact 13355 1727096175.11377: Evaluated conditional ('2001' in result.stdout): True 13355 1727096175.11386: attempt loop complete, returning result 13355 1727096175.11388: _execute() done 13355 1727096175.11393: dumping result to json 13355 1727096175.11402: done dumping result, returning 13355 1727096175.11405: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [0afff68d-5257-c514-593f-000000000073] 13355 1727096175.11410: sending task result for task 0afff68d-5257-c514-593f-000000000073 13355 1727096175.11508: done sending task result for task 0afff68d-5257-c514-593f-000000000073 13355 1727096175.11511: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003787", "end": "2024-09-23 08:56:15.066585", "rc": 0, "start": "2024-09-23 08:56:15.062798" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::f5/128 scope global dynamic noprefixroute valid_lft 235sec preferred_lft 235sec inet6 2001:db8::f871:61ff:fe85:c3c6/64 scope global dynamic noprefixroute valid_lft 1795sec preferred_lft 1795sec inet6 fe80::f871:61ff:fe85:c3c6/64 scope link noprefixroute valid_lft forever preferred_lft forever 13355 1727096175.11586: no more pending results, returning what we have 13355 1727096175.11589: results queue empty 13355 1727096175.11590: checking for any_errors_fatal 13355 1727096175.11598: 
done checking for any_errors_fatal 13355 1727096175.11599: checking for max_fail_percentage 13355 1727096175.11600: done checking for max_fail_percentage 13355 1727096175.11601: checking to see if all hosts have failed and the running result is not ok 13355 1727096175.11601: done checking to see if all hosts have failed 13355 1727096175.11602: getting the remaining hosts for this loop 13355 1727096175.11604: done getting the remaining hosts for this loop 13355 1727096175.11607: getting the next task for host managed_node3 13355 1727096175.11614: done getting next task for host managed_node3 13355 1727096175.11619: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096175.11621: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096175.11638: getting variables 13355 1727096175.11640: in VariableManager get_vars() 13355 1727096175.11695: Calling all_inventory to load vars for managed_node3 13355 1727096175.11698: Calling groups_inventory to load vars for managed_node3 13355 1727096175.11700: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096175.11709: Calling all_plugins_play to load vars for managed_node3 13355 1727096175.11711: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096175.11713: Calling groups_plugins_play to load vars for managed_node3 13355 1727096175.12521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096175.13395: done with get_vars() 13355 1727096175.13420: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:56:15 -0400 (0:00:00.398) 0:00:24.395 ****** 13355 1727096175.13499: entering _queue_task() for managed_node3/include_tasks 13355 1727096175.13771: worker is 1 (out of 1 available) 13355 1727096175.13784: exiting _queue_task() for managed_node3/include_tasks 13355 1727096175.13797: done queuing things up, now waiting for results queue to drain 13355 1727096175.13798: waiting for pending results... 
13355 1727096175.13974: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096175.14068: in run() - task 0afff68d-5257-c514-593f-00000000007b 13355 1727096175.14081: variable 'ansible_search_path' from source: unknown 13355 1727096175.14085: variable 'ansible_search_path' from source: unknown 13355 1727096175.14114: calling self._execute() 13355 1727096175.14191: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096175.14195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096175.14203: variable 'omit' from source: magic vars 13355 1727096175.14478: variable 'ansible_distribution_major_version' from source: facts 13355 1727096175.14488: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096175.14494: _execute() done 13355 1727096175.14497: dumping result to json 13355 1727096175.14500: done dumping result, returning 13355 1727096175.14507: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-c514-593f-00000000007b] 13355 1727096175.14512: sending task result for task 0afff68d-5257-c514-593f-00000000007b 13355 1727096175.14598: done sending task result for task 0afff68d-5257-c514-593f-00000000007b 13355 1727096175.14600: WORKER PROCESS EXITING 13355 1727096175.14643: no more pending results, returning what we have 13355 1727096175.14647: in VariableManager get_vars() 13355 1727096175.14708: Calling all_inventory to load vars for managed_node3 13355 1727096175.14712: Calling groups_inventory to load vars for managed_node3 13355 1727096175.14714: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096175.14726: Calling all_plugins_play to load vars for managed_node3 13355 1727096175.14729: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096175.14731: Calling 
groups_plugins_play to load vars for managed_node3 13355 1727096175.15688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096175.16543: done with get_vars() 13355 1727096175.16564: variable 'ansible_search_path' from source: unknown 13355 1727096175.16565: variable 'ansible_search_path' from source: unknown 13355 1727096175.16597: we have included files to process 13355 1727096175.16597: generating all_blocks data 13355 1727096175.16599: done generating all_blocks data 13355 1727096175.16602: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096175.16603: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096175.16604: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096175.16994: done processing included file 13355 1727096175.16996: iterating over new_blocks loaded from include file 13355 1727096175.16997: in VariableManager get_vars() 13355 1727096175.17019: done with get_vars() 13355 1727096175.17020: filtering new block on tags 13355 1727096175.17033: done filtering new block on tags 13355 1727096175.17035: in VariableManager get_vars() 13355 1727096175.17055: done with get_vars() 13355 1727096175.17057: filtering new block on tags 13355 1727096175.17072: done filtering new block on tags 13355 1727096175.17074: in VariableManager get_vars() 13355 1727096175.17093: done with get_vars() 13355 1727096175.17094: filtering new block on tags 13355 1727096175.17104: done filtering new block on tags 13355 1727096175.17105: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13355 1727096175.17109: extending task lists for 
all hosts with included blocks 13355 1727096175.17563: done extending task lists 13355 1727096175.17564: done processing included files 13355 1727096175.17565: results queue empty 13355 1727096175.17565: checking for any_errors_fatal 13355 1727096175.17569: done checking for any_errors_fatal 13355 1727096175.17570: checking for max_fail_percentage 13355 1727096175.17570: done checking for max_fail_percentage 13355 1727096175.17571: checking to see if all hosts have failed and the running result is not ok 13355 1727096175.17572: done checking to see if all hosts have failed 13355 1727096175.17572: getting the remaining hosts for this loop 13355 1727096175.17573: done getting the remaining hosts for this loop 13355 1727096175.17575: getting the next task for host managed_node3 13355 1727096175.17577: done getting next task for host managed_node3 13355 1727096175.17579: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096175.17581: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096175.17588: getting variables 13355 1727096175.17589: in VariableManager get_vars() 13355 1727096175.17606: Calling all_inventory to load vars for managed_node3 13355 1727096175.17607: Calling groups_inventory to load vars for managed_node3 13355 1727096175.17609: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096175.17613: Calling all_plugins_play to load vars for managed_node3 13355 1727096175.17614: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096175.17616: Calling groups_plugins_play to load vars for managed_node3 13355 1727096175.18310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096175.19250: done with get_vars() 13355 1727096175.19273: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:56:15 -0400 (0:00:00.058) 0:00:24.453 ****** 13355 1727096175.19335: entering _queue_task() for managed_node3/setup 13355 1727096175.19618: worker is 1 (out of 1 available) 13355 1727096175.19631: exiting _queue_task() for managed_node3/setup 13355 1727096175.19643: done queuing things up, now waiting for results queue to drain 13355 1727096175.19645: waiting for pending results... 
13355 1727096175.19839: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096175.19950: in run() - task 0afff68d-5257-c514-593f-0000000006c5 13355 1727096175.19965: variable 'ansible_search_path' from source: unknown 13355 1727096175.19970: variable 'ansible_search_path' from source: unknown 13355 1727096175.20004: calling self._execute() 13355 1727096175.20084: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096175.20089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096175.20101: variable 'omit' from source: magic vars 13355 1727096175.20384: variable 'ansible_distribution_major_version' from source: facts 13355 1727096175.20394: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096175.20544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096175.22065: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096175.22122: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096175.22151: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096175.22185: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096175.22203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096175.22264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096175.22291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096175.22309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096175.22335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096175.22346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096175.22391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096175.22408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096175.22426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096175.22450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096175.22463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096175.22582: variable '__network_required_facts' from source: role 
'' defaults 13355 1727096175.22590: variable 'ansible_facts' from source: unknown 13355 1727096175.23061: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13355 1727096175.23065: when evaluation is False, skipping this task 13355 1727096175.23070: _execute() done 13355 1727096175.23072: dumping result to json 13355 1727096175.23074: done dumping result, returning 13355 1727096175.23081: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-c514-593f-0000000006c5] 13355 1727096175.23086: sending task result for task 0afff68d-5257-c514-593f-0000000006c5 13355 1727096175.23175: done sending task result for task 0afff68d-5257-c514-593f-0000000006c5 13355 1727096175.23178: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096175.23226: no more pending results, returning what we have 13355 1727096175.23230: results queue empty 13355 1727096175.23231: checking for any_errors_fatal 13355 1727096175.23232: done checking for any_errors_fatal 13355 1727096175.23233: checking for max_fail_percentage 13355 1727096175.23234: done checking for max_fail_percentage 13355 1727096175.23235: checking to see if all hosts have failed and the running result is not ok 13355 1727096175.23236: done checking to see if all hosts have failed 13355 1727096175.23236: getting the remaining hosts for this loop 13355 1727096175.23238: done getting the remaining hosts for this loop 13355 1727096175.23241: getting the next task for host managed_node3 13355 1727096175.23249: done getting next task for host managed_node3 13355 1727096175.23252: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096175.23256: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096175.23276: getting variables 13355 1727096175.23278: in VariableManager get_vars() 13355 1727096175.23337: Calling all_inventory to load vars for managed_node3 13355 1727096175.23340: Calling groups_inventory to load vars for managed_node3 13355 1727096175.23342: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096175.23351: Calling all_plugins_play to load vars for managed_node3 13355 1727096175.23354: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096175.23357: Calling groups_plugins_play to load vars for managed_node3 13355 1727096175.24191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096175.25073: done with get_vars() 13355 1727096175.25097: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:56:15 -0400 (0:00:00.058) 0:00:24.512 ****** 13355 1727096175.25185: entering _queue_task() for managed_node3/stat 13355 1727096175.25450: worker is 1 (out of 1 
available) 13355 1727096175.25463: exiting _queue_task() for managed_node3/stat 13355 1727096175.25477: done queuing things up, now waiting for results queue to drain 13355 1727096175.25479: waiting for pending results... 13355 1727096175.25662: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096175.25769: in run() - task 0afff68d-5257-c514-593f-0000000006c7 13355 1727096175.25780: variable 'ansible_search_path' from source: unknown 13355 1727096175.25783: variable 'ansible_search_path' from source: unknown 13355 1727096175.25819: calling self._execute() 13355 1727096175.25891: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096175.25895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096175.25904: variable 'omit' from source: magic vars 13355 1727096175.26187: variable 'ansible_distribution_major_version' from source: facts 13355 1727096175.26196: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096175.26313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096175.26515: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096175.26549: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096175.26583: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096175.26607: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096175.26672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096175.26694: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096175.26713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096175.26731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096175.26803: variable '__network_is_ostree' from source: set_fact 13355 1727096175.26810: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096175.26812: when evaluation is False, skipping this task 13355 1727096175.26815: _execute() done 13355 1727096175.26817: dumping result to json 13355 1727096175.26820: done dumping result, returning 13355 1727096175.26828: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-c514-593f-0000000006c7] 13355 1727096175.26833: sending task result for task 0afff68d-5257-c514-593f-0000000006c7 13355 1727096175.26922: done sending task result for task 0afff68d-5257-c514-593f-0000000006c7 13355 1727096175.26925: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096175.26975: no more pending results, returning what we have 13355 1727096175.26979: results queue empty 13355 1727096175.26980: checking for any_errors_fatal 13355 1727096175.26987: done checking for any_errors_fatal 13355 1727096175.26987: checking for max_fail_percentage 13355 1727096175.26989: done checking for max_fail_percentage 13355 1727096175.26990: checking to see if all hosts have failed and the running result is not ok 13355 
1727096175.26990: done checking to see if all hosts have failed 13355 1727096175.26991: getting the remaining hosts for this loop 13355 1727096175.26992: done getting the remaining hosts for this loop 13355 1727096175.26996: getting the next task for host managed_node3 13355 1727096175.27002: done getting next task for host managed_node3 13355 1727096175.27006: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096175.27010: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096175.27027: getting variables 13355 1727096175.27028: in VariableManager get_vars() 13355 1727096175.27087: Calling all_inventory to load vars for managed_node3 13355 1727096175.27090: Calling groups_inventory to load vars for managed_node3 13355 1727096175.27092: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096175.27101: Calling all_plugins_play to load vars for managed_node3 13355 1727096175.27104: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096175.27106: Calling groups_plugins_play to load vars for managed_node3 13355 1727096175.28057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096175.28915: done with get_vars() 13355 1727096175.28939: done getting variables 13355 1727096175.28986: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:56:15 -0400 (0:00:00.038) 0:00:24.550 ****** 13355 1727096175.29012: entering _queue_task() for managed_node3/set_fact 13355 1727096175.29281: worker is 1 (out of 1 available) 13355 1727096175.29295: exiting _queue_task() for managed_node3/set_fact 13355 1727096175.29309: done queuing things up, now waiting for results queue to drain 13355 1727096175.29310: waiting for pending results... 
13355 1727096175.29498: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096175.29617: in run() - task 0afff68d-5257-c514-593f-0000000006c8 13355 1727096175.29627: variable 'ansible_search_path' from source: unknown 13355 1727096175.29631: variable 'ansible_search_path' from source: unknown 13355 1727096175.29662: calling self._execute() 13355 1727096175.29738: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096175.29744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096175.29753: variable 'omit' from source: magic vars 13355 1727096175.30033: variable 'ansible_distribution_major_version' from source: facts 13355 1727096175.30045: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096175.30163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096175.30360: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096175.30398: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096175.30423: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096175.30448: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096175.30515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096175.30533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096175.30551: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096175.30572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096175.30639: variable '__network_is_ostree' from source: set_fact 13355 1727096175.30646: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096175.30649: when evaluation is False, skipping this task 13355 1727096175.30651: _execute() done 13355 1727096175.30654: dumping result to json 13355 1727096175.30660: done dumping result, returning 13355 1727096175.30669: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-c514-593f-0000000006c8] 13355 1727096175.30674: sending task result for task 0afff68d-5257-c514-593f-0000000006c8 13355 1727096175.30760: done sending task result for task 0afff68d-5257-c514-593f-0000000006c8 13355 1727096175.30763: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096175.30811: no more pending results, returning what we have 13355 1727096175.30815: results queue empty 13355 1727096175.30815: checking for any_errors_fatal 13355 1727096175.30822: done checking for any_errors_fatal 13355 1727096175.30822: checking for max_fail_percentage 13355 1727096175.30824: done checking for max_fail_percentage 13355 1727096175.30824: checking to see if all hosts have failed and the running result is not ok 13355 1727096175.30825: done checking to see if all hosts have failed 13355 1727096175.30826: getting the remaining hosts for this loop 13355 1727096175.30827: done getting the remaining hosts for this loop 
13355 1727096175.30830: getting the next task for host managed_node3 13355 1727096175.30840: done getting next task for host managed_node3 13355 1727096175.30843: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096175.30847: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096175.30866: getting variables 13355 1727096175.30874: in VariableManager get_vars() 13355 1727096175.30923: Calling all_inventory to load vars for managed_node3 13355 1727096175.30926: Calling groups_inventory to load vars for managed_node3 13355 1727096175.30928: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096175.30937: Calling all_plugins_play to load vars for managed_node3 13355 1727096175.30940: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096175.30942: Calling groups_plugins_play to load vars for managed_node3 13355 1727096175.31739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096175.32592: done with get_vars() 13355 1727096175.32613: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:56:15 -0400 (0:00:00.036) 0:00:24.587 ****** 13355 1727096175.32687: entering _queue_task() for managed_node3/service_facts 13355 1727096175.32984: worker is 1 (out of 1 available) 13355 1727096175.32999: exiting _queue_task() for managed_node3/service_facts 13355 1727096175.33015: done queuing things up, now waiting for results queue to drain 13355 1727096175.33016: waiting for pending results... 
13355 1727096175.33227: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096175.33334: in run() - task 0afff68d-5257-c514-593f-0000000006ca 13355 1727096175.33348: variable 'ansible_search_path' from source: unknown 13355 1727096175.33351: variable 'ansible_search_path' from source: unknown 13355 1727096175.33383: calling self._execute() 13355 1727096175.33457: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096175.33461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096175.33471: variable 'omit' from source: magic vars 13355 1727096175.33741: variable 'ansible_distribution_major_version' from source: facts 13355 1727096175.33750: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096175.33757: variable 'omit' from source: magic vars 13355 1727096175.33806: variable 'omit' from source: magic vars 13355 1727096175.33833: variable 'omit' from source: magic vars 13355 1727096175.33869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096175.33897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096175.33915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096175.33928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096175.33938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096175.33962: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096175.33966: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096175.33970: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13355 1727096175.34041: Set connection var ansible_shell_executable to /bin/sh 13355 1727096175.34045: Set connection var ansible_shell_type to sh 13355 1727096175.34051: Set connection var ansible_pipelining to False 13355 1727096175.34057: Set connection var ansible_connection to ssh 13355 1727096175.34059: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096175.34065: Set connection var ansible_timeout to 10 13355 1727096175.34085: variable 'ansible_shell_executable' from source: unknown 13355 1727096175.34088: variable 'ansible_connection' from source: unknown 13355 1727096175.34091: variable 'ansible_module_compression' from source: unknown 13355 1727096175.34094: variable 'ansible_shell_type' from source: unknown 13355 1727096175.34096: variable 'ansible_shell_executable' from source: unknown 13355 1727096175.34099: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096175.34101: variable 'ansible_pipelining' from source: unknown 13355 1727096175.34103: variable 'ansible_timeout' from source: unknown 13355 1727096175.34105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096175.34251: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096175.34261: variable 'omit' from source: magic vars 13355 1727096175.34266: starting attempt loop 13355 1727096175.34270: running the handler 13355 1727096175.34282: _low_level_execute_command(): starting 13355 1727096175.34289: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096175.34898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096175.34902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13355 1727096175.34905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096175.34907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096175.34910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096175.34913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096175.34915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096175.34917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096175.34918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096175.35002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096175.35037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096175.36692: stdout chunk (state=3): >>>/root <<< 13355 1727096175.36793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096175.36820: stderr chunk (state=3): >>><<< 13355 1727096175.36823: stdout chunk (state=3): >>><<< 13355 1727096175.36844: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096175.36861: _low_level_execute_command(): starting 13355 1727096175.36869: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838 `" && echo ansible-tmp-1727096175.3684285-14481-105642663033838="` echo /root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838 `" ) && sleep 0' 13355 1727096175.37323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096175.37327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096175.37337: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096175.37340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096175.37343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096175.37377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096175.37380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096175.37425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096175.39341: stdout chunk (state=3): >>>ansible-tmp-1727096175.3684285-14481-105642663033838=/root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838 <<< 13355 1727096175.39496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096175.39500: stdout chunk (state=3): >>><<< 13355 1727096175.39503: stderr chunk (state=3): >>><<< 13355 1727096175.39518: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096175.3684285-14481-105642663033838=/root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096175.39574: variable 'ansible_module_compression' from source: unknown 13355 1727096175.39676: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13355 1727096175.39680: variable 'ansible_facts' from source: unknown 13355 1727096175.39769: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/AnsiballZ_service_facts.py 13355 1727096175.39910: Sending initial data 13355 1727096175.40009: Sent initial data (162 bytes) 13355 1727096175.40570: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096175.40588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096175.40674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096175.40717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096175.40745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096175.40760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096175.40848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096175.42490: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096175.42522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096175.42556: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp11bvup2n /root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/AnsiballZ_service_facts.py <<< 13355 1727096175.42577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/AnsiballZ_service_facts.py" <<< 13355 1727096175.42596: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13355 1727096175.42619: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp11bvup2n" to remote "/root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/AnsiballZ_service_facts.py" <<< 13355 1727096175.43297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096175.43350: stderr chunk (state=3): >>><<< 13355 1727096175.43364: stdout chunk (state=3): >>><<< 13355 1727096175.43525: done transferring module to remote 13355 1727096175.43528: _low_level_execute_command(): starting 13355 1727096175.43531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/ /root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/AnsiballZ_service_facts.py && sleep 0' 13355 1727096175.44185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096175.44237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096175.44260: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096175.44281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096175.44341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096175.46254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096175.46258: stdout chunk (state=3): >>><<< 13355 1727096175.46261: stderr chunk (state=3): >>><<< 13355 1727096175.46362: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096175.46365: _low_level_execute_command(): starting 13355 1727096175.46371: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/AnsiballZ_service_facts.py && sleep 0' 13355 1727096175.46942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096175.46957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096175.46973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096175.47038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096175.47093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096175.47110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 13355 1727096175.47138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096175.47208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096177.13308: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": 
{"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": 
{"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": 
"user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13355 1727096177.15275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096177.15279: stdout chunk (state=3): >>><<< 13355 1727096177.15282: stderr chunk (state=3): >>><<< 13355 1727096177.15287: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096177.16708: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096177.16731: _low_level_execute_command(): starting 13355 1727096177.16744: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096175.3684285-14481-105642663033838/ > /dev/null 2>&1 && sleep 0' 13355 1727096177.17394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096177.17398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096177.17427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096177.17431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096177.17434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
13355 1727096177.17436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096177.17486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096177.17490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096177.17497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096177.17550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096177.19477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096177.19481: stdout chunk (state=3): >>><<< 13355 1727096177.19512: stderr chunk (state=3): >>><<< 13355 1727096177.19516: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096177.19518: handler run complete 13355 1727096177.19742: variable 'ansible_facts' from source: unknown 13355 1727096177.19943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096177.20359: variable 'ansible_facts' from source: unknown 13355 1727096177.20443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096177.20564: attempt loop complete, returning result 13355 1727096177.20569: _execute() done 13355 1727096177.20572: dumping result to json 13355 1727096177.20610: done dumping result, returning 13355 1727096177.20619: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-c514-593f-0000000006ca] 13355 1727096177.20624: sending task result for task 0afff68d-5257-c514-593f-0000000006ca ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096177.21526: no more pending results, returning what we have 13355 1727096177.21529: results queue empty 13355 1727096177.21530: checking for any_errors_fatal 13355 1727096177.21534: done checking for any_errors_fatal 13355 1727096177.21535: checking for max_fail_percentage 13355 1727096177.21536: done checking for max_fail_percentage 13355 1727096177.21537: checking to see if all hosts have failed and the running result is not ok 13355 1727096177.21538: done checking to see if all hosts have failed 13355 1727096177.21539: getting the remaining hosts for this loop 13355 1727096177.21540: done getting the remaining hosts for this loop 13355 1727096177.21543: getting the next task for host managed_node3 13355 1727096177.21548: done getting next task for 
host managed_node3 13355 1727096177.21552: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096177.21557: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096177.21569: getting variables 13355 1727096177.21570: in VariableManager get_vars() 13355 1727096177.21613: Calling all_inventory to load vars for managed_node3 13355 1727096177.21616: Calling groups_inventory to load vars for managed_node3 13355 1727096177.21618: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096177.21628: Calling all_plugins_play to load vars for managed_node3 13355 1727096177.21631: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096177.21634: Calling groups_plugins_play to load vars for managed_node3 13355 1727096177.22182: done sending task result for task 0afff68d-5257-c514-593f-0000000006ca 13355 1727096177.22186: WORKER PROCESS EXITING 13355 1727096177.22838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096177.24211: done with get_vars() 13355 1727096177.24237: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:56:17 -0400 (0:00:01.916) 0:00:26.503 ****** 13355 1727096177.24316: entering _queue_task() for managed_node3/package_facts 13355 1727096177.24581: worker is 1 (out of 1 available) 13355 1727096177.24594: exiting _queue_task() for managed_node3/package_facts 13355 1727096177.24608: done queuing things up, now waiting for results queue to drain 13355 1727096177.24609: waiting for pending results... 
13355 1727096177.24795: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096177.24904: in run() - task 0afff68d-5257-c514-593f-0000000006cb 13355 1727096177.24917: variable 'ansible_search_path' from source: unknown 13355 1727096177.24921: variable 'ansible_search_path' from source: unknown 13355 1727096177.24948: calling self._execute() 13355 1727096177.25027: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096177.25032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096177.25040: variable 'omit' from source: magic vars 13355 1727096177.25331: variable 'ansible_distribution_major_version' from source: facts 13355 1727096177.25340: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096177.25346: variable 'omit' from source: magic vars 13355 1727096177.25458: variable 'omit' from source: magic vars 13355 1727096177.25462: variable 'omit' from source: magic vars 13355 1727096177.25534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096177.25543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096177.25547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096177.25773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096177.25776: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096177.25778: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096177.25781: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096177.25782: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13355 1727096177.25784: Set connection var ansible_shell_executable to /bin/sh 13355 1727096177.25786: Set connection var ansible_shell_type to sh 13355 1727096177.25789: Set connection var ansible_pipelining to False 13355 1727096177.25790: Set connection var ansible_connection to ssh 13355 1727096177.25792: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096177.25794: Set connection var ansible_timeout to 10 13355 1727096177.25795: variable 'ansible_shell_executable' from source: unknown 13355 1727096177.25804: variable 'ansible_connection' from source: unknown 13355 1727096177.25812: variable 'ansible_module_compression' from source: unknown 13355 1727096177.25827: variable 'ansible_shell_type' from source: unknown 13355 1727096177.25835: variable 'ansible_shell_executable' from source: unknown 13355 1727096177.25842: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096177.25848: variable 'ansible_pipelining' from source: unknown 13355 1727096177.25856: variable 'ansible_timeout' from source: unknown 13355 1727096177.25864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096177.26066: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096177.26094: variable 'omit' from source: magic vars 13355 1727096177.26105: starting attempt loop 13355 1727096177.26112: running the handler 13355 1727096177.26129: _low_level_execute_command(): starting 13355 1727096177.26141: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096177.26971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096177.26995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096177.27010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096177.27080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096177.28773: stdout chunk (state=3): >>>/root <<< 13355 1727096177.28884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096177.28903: stderr chunk (state=3): >>><<< 13355 1727096177.28906: stdout chunk (state=3): >>><<< 13355 1727096177.28924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096177.28936: _low_level_execute_command(): starting 13355 1727096177.28944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927 `" && echo ansible-tmp-1727096177.2892404-14563-216332321786927="` echo /root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927 `" ) && sleep 0' 13355 1727096177.29586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096177.29694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096177.29697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096177.29709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096177.29787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096177.31785: stdout chunk (state=3): >>>ansible-tmp-1727096177.2892404-14563-216332321786927=/root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927 <<< 13355 1727096177.31884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096177.31912: stderr chunk (state=3): >>><<< 13355 1727096177.31916: stdout chunk (state=3): >>><<< 13355 1727096177.31938: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096177.2892404-14563-216332321786927=/root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096177.31981: variable 'ansible_module_compression' from source: unknown 13355 1727096177.32020: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13355 1727096177.32076: variable 'ansible_facts' from source: unknown 13355 1727096177.32194: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/AnsiballZ_package_facts.py 13355 1727096177.32304: Sending initial data 13355 1727096177.32307: Sent initial data (162 bytes) 13355 1727096177.32769: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096177.32773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096177.32776: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096177.32778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096177.32780: stderr chunk (state=3): >>>debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096177.32828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096177.32832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096177.32834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096177.32876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096177.34510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096177.34540: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096177.34576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpf1gz0tr5 /root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/AnsiballZ_package_facts.py <<< 13355 1727096177.34582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/AnsiballZ_package_facts.py" <<< 13355 1727096177.34607: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpf1gz0tr5" to remote "/root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/AnsiballZ_package_facts.py" <<< 13355 1727096177.34609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/AnsiballZ_package_facts.py" <<< 13355 1727096177.35588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096177.35632: stderr chunk (state=3): >>><<< 13355 1727096177.35635: stdout chunk (state=3): >>><<< 13355 1727096177.35674: done transferring module to remote 13355 1727096177.35684: _low_level_execute_command(): starting 13355 1727096177.35689: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/ /root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/AnsiballZ_package_facts.py && sleep 0' 13355 1727096177.36125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096177.36129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096177.36141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096177.36201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096177.36209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096177.36216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096177.36244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096177.38091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096177.38193: stderr chunk (state=3): >>><<< 13355 1727096177.38197: stdout chunk (state=3): >>><<< 13355 1727096177.38200: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096177.38204: _low_level_execute_command(): starting 13355 1727096177.38206: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/AnsiballZ_package_facts.py && sleep 0' 13355 1727096177.38566: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096177.38584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096177.38596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096177.38651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 
1727096177.38668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096177.38674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096177.38702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096177.83715: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 13355 1727096177.83742: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 13355 1727096177.83760: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 13355 1727096177.83802: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 13355 1727096177.83819: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 13355 1727096177.83849: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": 
"4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 13355 1727096177.83858: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 13355 1727096177.83901: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": 
[{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 13355 1727096177.83915: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 13355 1727096177.83926: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", 
"version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 13355 1727096177.83949: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13355 1727096177.85850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096177.85885: stderr chunk (state=3): >>><<< 13355 1727096177.85888: stdout chunk (state=3): >>><<< 13355 1727096177.85926: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096177.87584: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096177.87589: _low_level_execute_command(): starting 13355 1727096177.87592: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096177.2892404-14563-216332321786927/ > /dev/null 2>&1 && sleep 0' 13355 1727096177.88222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096177.88256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096177.88278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096177.88299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096177.88364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096177.88427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096177.88470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096177.88496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096177.88576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096177.90549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096177.90558: stderr chunk (state=3): >>><<< 13355 1727096177.90560: stdout chunk (state=3): >>><<< 13355 1727096177.90759: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096177.90763: handler run complete 13355 1727096177.91147: variable 'ansible_facts' from source: unknown 13355 1727096177.95896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096177.96949: variable 'ansible_facts' from source: unknown 13355 1727096177.97195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096177.97577: attempt loop complete, returning result 13355 1727096177.97587: _execute() done 13355 1727096177.97590: dumping result to json 13355 1727096177.97707: done dumping result, returning 13355 1727096177.97715: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-c514-593f-0000000006cb] 13355 1727096177.97719: sending task result for task 0afff68d-5257-c514-593f-0000000006cb 13355 1727096178.03640: done sending task result for task 0afff68d-5257-c514-593f-0000000006cb 13355 1727096178.03644: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096178.03701: no more pending results, returning what we have 13355 1727096178.03703: results queue empty 13355 1727096178.03704: checking for any_errors_fatal 13355 1727096178.03706: done checking for any_errors_fatal 13355 1727096178.03707: checking for max_fail_percentage 13355 1727096178.03707: done checking for max_fail_percentage 13355 1727096178.03708: checking to see if all hosts have failed and the running result is not ok 13355 1727096178.03708: done checking to see if all hosts have failed 13355 1727096178.03709: getting the remaining hosts for this loop 13355 1727096178.03710: done getting the remaining hosts for this loop 13355 1727096178.03712: getting 
the next task for host managed_node3 13355 1727096178.03715: done getting next task for host managed_node3 13355 1727096178.03717: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096178.03719: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096178.03724: getting variables 13355 1727096178.03725: in VariableManager get_vars() 13355 1727096178.03747: Calling all_inventory to load vars for managed_node3 13355 1727096178.03749: Calling groups_inventory to load vars for managed_node3 13355 1727096178.03750: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096178.03757: Calling all_plugins_play to load vars for managed_node3 13355 1727096178.03759: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096178.03761: Calling groups_plugins_play to load vars for managed_node3 13355 1727096178.04563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096178.06186: done with get_vars() 13355 1727096178.06214: done getting variables 13355 1727096178.06273: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print 
network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:56:18 -0400 (0:00:00.819) 0:00:27.323 ****** 13355 1727096178.06309: entering _queue_task() for managed_node3/debug 13355 1727096178.06679: worker is 1 (out of 1 available) 13355 1727096178.06692: exiting _queue_task() for managed_node3/debug 13355 1727096178.06705: done queuing things up, now waiting for results queue to drain 13355 1727096178.06706: waiting for pending results... 13355 1727096178.07087: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096178.07130: in run() - task 0afff68d-5257-c514-593f-00000000007c 13355 1727096178.07150: variable 'ansible_search_path' from source: unknown 13355 1727096178.07161: variable 'ansible_search_path' from source: unknown 13355 1727096178.07206: calling self._execute() 13355 1727096178.07312: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.07323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.07335: variable 'omit' from source: magic vars 13355 1727096178.07727: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.07745: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096178.07758: variable 'omit' from source: magic vars 13355 1727096178.07818: variable 'omit' from source: magic vars 13355 1727096178.07926: variable 'network_provider' from source: set_fact 13355 1727096178.07956: variable 'omit' from source: magic vars 13355 1727096178.08005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096178.08162: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096178.08166: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096178.08170: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096178.08173: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096178.08175: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096178.08178: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.08180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.08279: Set connection var ansible_shell_executable to /bin/sh 13355 1727096178.08290: Set connection var ansible_shell_type to sh 13355 1727096178.08299: Set connection var ansible_pipelining to False 13355 1727096178.08308: Set connection var ansible_connection to ssh 13355 1727096178.08316: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096178.08325: Set connection var ansible_timeout to 10 13355 1727096178.08351: variable 'ansible_shell_executable' from source: unknown 13355 1727096178.08363: variable 'ansible_connection' from source: unknown 13355 1727096178.08375: variable 'ansible_module_compression' from source: unknown 13355 1727096178.08382: variable 'ansible_shell_type' from source: unknown 13355 1727096178.08388: variable 'ansible_shell_executable' from source: unknown 13355 1727096178.08394: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.08401: variable 'ansible_pipelining' from source: unknown 13355 1727096178.08417: variable 'ansible_timeout' from source: unknown 13355 1727096178.08428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.08582: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096178.08605: variable 'omit' from source: magic vars 13355 1727096178.08615: starting attempt loop 13355 1727096178.08622: running the handler 13355 1727096178.08699: handler run complete 13355 1727096178.08703: attempt loop complete, returning result 13355 1727096178.08705: _execute() done 13355 1727096178.08707: dumping result to json 13355 1727096178.08708: done dumping result, returning 13355 1727096178.08717: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-c514-593f-00000000007c] 13355 1727096178.08726: sending task result for task 0afff68d-5257-c514-593f-00000000007c 13355 1727096178.09073: done sending task result for task 0afff68d-5257-c514-593f-00000000007c 13355 1727096178.09077: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 13355 1727096178.09137: no more pending results, returning what we have 13355 1727096178.09141: results queue empty 13355 1727096178.09142: checking for any_errors_fatal 13355 1727096178.09151: done checking for any_errors_fatal 13355 1727096178.09152: checking for max_fail_percentage 13355 1727096178.09156: done checking for max_fail_percentage 13355 1727096178.09157: checking to see if all hosts have failed and the running result is not ok 13355 1727096178.09158: done checking to see if all hosts have failed 13355 1727096178.09159: getting the remaining hosts for this loop 13355 1727096178.09160: done getting the remaining hosts for this loop 13355 1727096178.09164: getting the next task for host managed_node3 13355 1727096178.09173: done getting next task for host managed_node3 13355 1727096178.09177: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 13355 1727096178.09180: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096178.09192: getting variables 13355 1727096178.09194: in VariableManager get_vars() 13355 1727096178.09246: Calling all_inventory to load vars for managed_node3 13355 1727096178.09249: Calling groups_inventory to load vars for managed_node3 13355 1727096178.09252: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096178.09265: Calling all_plugins_play to load vars for managed_node3 13355 1727096178.09387: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096178.09392: Calling groups_plugins_play to load vars for managed_node3 13355 1727096178.10899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096178.12538: done with get_vars() 13355 1727096178.12588: done getting variables 13355 1727096178.12731: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:56:18 -0400 (0:00:00.064) 0:00:27.388 ****** 13355 1727096178.12777: entering _queue_task() for managed_node3/fail 13355 1727096178.13149: worker is 1 (out of 1 available) 13355 1727096178.13166: exiting _queue_task() for managed_node3/fail 13355 1727096178.13281: done queuing things up, now waiting for results queue to drain 13355 1727096178.13283: waiting for pending results... 13355 1727096178.13483: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13355 1727096178.13639: in run() - task 0afff68d-5257-c514-593f-00000000007d 13355 1727096178.13660: variable 'ansible_search_path' from source: unknown 13355 1727096178.13669: variable 'ansible_search_path' from source: unknown 13355 1727096178.13709: calling self._execute() 13355 1727096178.13809: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.13820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.13841: variable 'omit' from source: magic vars 13355 1727096178.14239: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.14256: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096178.14389: variable 'network_state' from source: role '' defaults 13355 1727096178.14405: Evaluated conditional (network_state != {}): False 13355 1727096178.14412: when evaluation is False, skipping this task 13355 1727096178.14419: _execute() done 13355 1727096178.14426: dumping result to json 13355 1727096178.14433: done dumping result, returning 13355 1727096178.14444: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0afff68d-5257-c514-593f-00000000007d] 13355 1727096178.14454: sending task result for task 0afff68d-5257-c514-593f-00000000007d skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096178.14636: no more pending results, returning what we have 13355 1727096178.14640: results queue empty 13355 1727096178.14641: checking for any_errors_fatal 13355 1727096178.14647: done checking for any_errors_fatal 13355 1727096178.14647: checking for max_fail_percentage 13355 1727096178.14649: done checking for max_fail_percentage 13355 1727096178.14650: checking to see if all hosts have failed and the running result is not ok 13355 1727096178.14651: done checking to see if all hosts have failed 13355 1727096178.14651: getting the remaining hosts for this loop 13355 1727096178.14655: done getting the remaining hosts for this loop 13355 1727096178.14658: getting the next task for host managed_node3 13355 1727096178.14665: done getting next task for host managed_node3 13355 1727096178.14671: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096178.14674: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096178.14699: getting variables 13355 1727096178.14701: in VariableManager get_vars() 13355 1727096178.14757: Calling all_inventory to load vars for managed_node3 13355 1727096178.14759: Calling groups_inventory to load vars for managed_node3 13355 1727096178.14762: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096178.14777: Calling all_plugins_play to load vars for managed_node3 13355 1727096178.14780: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096178.14783: Calling groups_plugins_play to load vars for managed_node3 13355 1727096178.15306: done sending task result for task 0afff68d-5257-c514-593f-00000000007d 13355 1727096178.15309: WORKER PROCESS EXITING 13355 1727096178.16112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096178.17735: done with get_vars() 13355 1727096178.17772: done getting variables 13355 1727096178.17846: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:56:18 -0400 (0:00:00.051) 0:00:27.439 ****** 13355 1727096178.17885: entering _queue_task() for managed_node3/fail 13355 1727096178.18391: worker is 1 (out of 1 available) 13355 1727096178.18403: exiting _queue_task() for managed_node3/fail 13355 1727096178.18414: done queuing things up, now waiting for results queue to drain 13355 1727096178.18416: waiting for pending results... 
13355 1727096178.18658: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096178.18791: in run() - task 0afff68d-5257-c514-593f-00000000007e 13355 1727096178.18812: variable 'ansible_search_path' from source: unknown 13355 1727096178.18821: variable 'ansible_search_path' from source: unknown 13355 1727096178.18870: calling self._execute() 13355 1727096178.18986: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.19076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.19080: variable 'omit' from source: magic vars 13355 1727096178.19431: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.19446: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096178.19570: variable 'network_state' from source: role '' defaults 13355 1727096178.19588: Evaluated conditional (network_state != {}): False 13355 1727096178.19597: when evaluation is False, skipping this task 13355 1727096178.19618: _execute() done 13355 1727096178.19621: dumping result to json 13355 1727096178.19738: done dumping result, returning 13355 1727096178.19743: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-c514-593f-00000000007e] 13355 1727096178.19746: sending task result for task 0afff68d-5257-c514-593f-00000000007e 13355 1727096178.19816: done sending task result for task 0afff68d-5257-c514-593f-00000000007e 13355 1727096178.19819: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096178.19892: no more pending results, returning what we have 13355 
1727096178.19896: results queue empty 13355 1727096178.19897: checking for any_errors_fatal 13355 1727096178.19903: done checking for any_errors_fatal 13355 1727096178.19904: checking for max_fail_percentage 13355 1727096178.19905: done checking for max_fail_percentage 13355 1727096178.19906: checking to see if all hosts have failed and the running result is not ok 13355 1727096178.19907: done checking to see if all hosts have failed 13355 1727096178.19907: getting the remaining hosts for this loop 13355 1727096178.19909: done getting the remaining hosts for this loop 13355 1727096178.19912: getting the next task for host managed_node3 13355 1727096178.19919: done getting next task for host managed_node3 13355 1727096178.19923: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096178.19927: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096178.20187: getting variables 13355 1727096178.20188: in VariableManager get_vars() 13355 1727096178.20237: Calling all_inventory to load vars for managed_node3 13355 1727096178.20240: Calling groups_inventory to load vars for managed_node3 13355 1727096178.20243: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096178.20251: Calling all_plugins_play to load vars for managed_node3 13355 1727096178.20254: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096178.20257: Calling groups_plugins_play to load vars for managed_node3 13355 1727096178.21790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096178.23393: done with get_vars() 13355 1727096178.23423: done getting variables 13355 1727096178.23480: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:56:18 -0400 (0:00:00.056) 0:00:27.495 ****** 13355 1727096178.23514: entering _queue_task() for managed_node3/fail 13355 1727096178.23996: worker is 1 (out of 1 available) 13355 1727096178.24009: exiting _queue_task() for managed_node3/fail 13355 1727096178.24019: done queuing things up, now waiting for results queue to drain 13355 1727096178.24021: waiting for pending results... 
13355 1727096178.24297: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096178.24502: in run() - task 0afff68d-5257-c514-593f-00000000007f 13355 1727096178.24506: variable 'ansible_search_path' from source: unknown 13355 1727096178.24510: variable 'ansible_search_path' from source: unknown 13355 1727096178.24512: calling self._execute() 13355 1727096178.24598: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.24622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.24640: variable 'omit' from source: magic vars 13355 1727096178.25035: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.25060: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096178.25245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096178.27476: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096178.27549: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096178.27658: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096178.27662: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096178.27664: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096178.27741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.27795: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.27826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.27879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.27897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.27999: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.28018: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13355 1727096178.28141: variable 'ansible_distribution' from source: facts 13355 1727096178.28149: variable '__network_rh_distros' from source: role '' defaults 13355 1727096178.28163: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13355 1727096178.28572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.28575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.28578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 
1727096178.28581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.28583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.28585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.28615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.28644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.28686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.28711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.28754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.28784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13355 1727096178.28817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.28858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.28878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.29197: variable 'network_connections' from source: task vars 13355 1727096178.29213: variable 'controller_profile' from source: play vars 13355 1727096178.29287: variable 'controller_profile' from source: play vars 13355 1727096178.29355: variable 'network_state' from source: role '' defaults 13355 1727096178.29381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096178.29551: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096178.29601: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096178.29634: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096178.29666: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096178.29720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096178.29749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096178.29893: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.29896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096178.29899: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13355 1727096178.29904: when evaluation is False, skipping this task 13355 1727096178.29906: _execute() done 13355 1727096178.29908: dumping result to json 13355 1727096178.29910: done dumping result, returning 13355 1727096178.29913: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-c514-593f-00000000007f] 13355 1727096178.29915: sending task result for task 0afff68d-5257-c514-593f-00000000007f skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13355 1727096178.30120: no more pending results, returning what we have 13355 1727096178.30123: results queue empty 13355 1727096178.30124: checking for any_errors_fatal 13355 1727096178.30132: done checking for 
any_errors_fatal 13355 1727096178.30133: checking for max_fail_percentage 13355 1727096178.30135: done checking for max_fail_percentage 13355 1727096178.30135: checking to see if all hosts have failed and the running result is not ok 13355 1727096178.30136: done checking to see if all hosts have failed 13355 1727096178.30137: getting the remaining hosts for this loop 13355 1727096178.30138: done getting the remaining hosts for this loop 13355 1727096178.30142: getting the next task for host managed_node3 13355 1727096178.30149: done getting next task for host managed_node3 13355 1727096178.30153: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13355 1727096178.30155: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096178.30177: getting variables 13355 1727096178.30178: in VariableManager get_vars() 13355 1727096178.30233: Calling all_inventory to load vars for managed_node3 13355 1727096178.30236: Calling groups_inventory to load vars for managed_node3 13355 1727096178.30239: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096178.30249: Calling all_plugins_play to load vars for managed_node3 13355 1727096178.30253: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096178.30256: Calling groups_plugins_play to load vars for managed_node3 13355 1727096178.30921: done sending task result for task 0afff68d-5257-c514-593f-00000000007f 13355 1727096178.30924: WORKER PROCESS EXITING 13355 1727096178.31912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096178.33507: done with get_vars() 13355 1727096178.33531: done getting variables 13355 1727096178.33595: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:56:18 -0400 (0:00:00.101) 0:00:27.596 ****** 13355 1727096178.33628: entering _queue_task() for managed_node3/dnf 13355 1727096178.34095: worker is 1 (out of 1 available) 13355 1727096178.34106: exiting _queue_task() for managed_node3/dnf 13355 1727096178.34119: done queuing things up, now waiting for results queue to drain 13355 1727096178.34120: waiting for pending results... 
13355 1727096178.34362: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13355 1727096178.34504: in run() - task 0afff68d-5257-c514-593f-000000000080 13355 1727096178.34523: variable 'ansible_search_path' from source: unknown 13355 1727096178.34531: variable 'ansible_search_path' from source: unknown 13355 1727096178.34581: calling self._execute() 13355 1727096178.34697: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.34709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.34723: variable 'omit' from source: magic vars 13355 1727096178.35107: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.35128: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096178.35318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096178.37775: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096178.37914: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096178.37919: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096178.37952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096178.37989: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096178.38077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.38112: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.38151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.38201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.38240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.38365: variable 'ansible_distribution' from source: facts 13355 1727096178.38457: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.38461: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13355 1727096178.38532: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096178.38691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.38718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.38747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.38799: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.38819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.38861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.39072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.39076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.39078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.39080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.39083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.39084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 
1727096178.39087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.39116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.39135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.39322: variable 'network_connections' from source: task vars 13355 1727096178.39340: variable 'controller_profile' from source: play vars 13355 1727096178.39406: variable 'controller_profile' from source: play vars 13355 1727096178.39491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096178.39675: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096178.39717: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096178.39758: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096178.39793: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096178.39840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096178.39877: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096178.39917: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.39947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096178.40003: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096178.40293: variable 'network_connections' from source: task vars 13355 1727096178.40296: variable 'controller_profile' from source: play vars 13355 1727096178.40328: variable 'controller_profile' from source: play vars 13355 1727096178.40357: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096178.40365: when evaluation is False, skipping this task 13355 1727096178.40375: _execute() done 13355 1727096178.40382: dumping result to json 13355 1727096178.40389: done dumping result, returning 13355 1727096178.40473: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000080] 13355 1727096178.40476: sending task result for task 0afff68d-5257-c514-593f-000000000080 13355 1727096178.40773: done sending task result for task 0afff68d-5257-c514-593f-000000000080 13355 1727096178.40777: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096178.40824: no more pending results, returning what we have 13355 1727096178.40828: results queue empty 13355 1727096178.40829: checking for any_errors_fatal 13355 1727096178.40834: done checking for 
any_errors_fatal 13355 1727096178.40835: checking for max_fail_percentage 13355 1727096178.40837: done checking for max_fail_percentage 13355 1727096178.40837: checking to see if all hosts have failed and the running result is not ok 13355 1727096178.40838: done checking to see if all hosts have failed 13355 1727096178.40839: getting the remaining hosts for this loop 13355 1727096178.40840: done getting the remaining hosts for this loop 13355 1727096178.40843: getting the next task for host managed_node3 13355 1727096178.40849: done getting next task for host managed_node3 13355 1727096178.40853: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13355 1727096178.40856: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096178.40875: getting variables 13355 1727096178.40876: in VariableManager get_vars() 13355 1727096178.40926: Calling all_inventory to load vars for managed_node3 13355 1727096178.40929: Calling groups_inventory to load vars for managed_node3 13355 1727096178.40931: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096178.40941: Calling all_plugins_play to load vars for managed_node3 13355 1727096178.40944: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096178.40947: Calling groups_plugins_play to load vars for managed_node3 13355 1727096178.42505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096178.44101: done with get_vars() 13355 1727096178.44130: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13355 1727096178.44205: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:56:18 -0400 (0:00:00.106) 0:00:27.703 ****** 13355 1727096178.44241: entering _queue_task() for managed_node3/yum 13355 1727096178.44695: worker is 1 (out of 1 available) 13355 1727096178.44706: exiting _queue_task() for managed_node3/yum 13355 1727096178.44716: done queuing things up, now waiting for results queue to drain 13355 1727096178.44717: waiting for pending results... 
13355 1727096178.44889: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13355 1727096178.45050: in run() - task 0afff68d-5257-c514-593f-000000000081 13355 1727096178.45074: variable 'ansible_search_path' from source: unknown 13355 1727096178.45082: variable 'ansible_search_path' from source: unknown 13355 1727096178.45125: calling self._execute() 13355 1727096178.45237: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.45249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.45332: variable 'omit' from source: magic vars 13355 1727096178.45636: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.45685: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096178.45920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096178.48678: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096178.48755: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096178.48852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096178.48856: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096178.48871: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096178.48948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.49004: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.49072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.49085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.49272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.49275: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.49278: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13355 1727096178.49280: when evaluation is False, skipping this task 13355 1727096178.49282: _execute() done 13355 1727096178.49284: dumping result to json 13355 1727096178.49286: done dumping result, returning 13355 1727096178.49288: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000081] 13355 1727096178.49290: sending task result for task 0afff68d-5257-c514-593f-000000000081 13355 1727096178.49362: done sending task result for task 0afff68d-5257-c514-593f-000000000081 13355 1727096178.49366: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13355 1727096178.49432: no more pending results, returning 
what we have 13355 1727096178.49436: results queue empty 13355 1727096178.49439: checking for any_errors_fatal 13355 1727096178.49446: done checking for any_errors_fatal 13355 1727096178.49447: checking for max_fail_percentage 13355 1727096178.49449: done checking for max_fail_percentage 13355 1727096178.49450: checking to see if all hosts have failed and the running result is not ok 13355 1727096178.49451: done checking to see if all hosts have failed 13355 1727096178.49451: getting the remaining hosts for this loop 13355 1727096178.49453: done getting the remaining hosts for this loop 13355 1727096178.49457: getting the next task for host managed_node3 13355 1727096178.49471: done getting next task for host managed_node3 13355 1727096178.49476: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096178.49480: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096178.49498: getting variables 13355 1727096178.49500: in VariableManager get_vars() 13355 1727096178.49555: Calling all_inventory to load vars for managed_node3 13355 1727096178.49558: Calling groups_inventory to load vars for managed_node3 13355 1727096178.49561: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096178.49685: Calling all_plugins_play to load vars for managed_node3 13355 1727096178.49690: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096178.49694: Calling groups_plugins_play to load vars for managed_node3 13355 1727096178.51282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096178.53488: done with get_vars() 13355 1727096178.53522: done getting variables 13355 1727096178.53739: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:56:18 -0400 (0:00:00.095) 0:00:27.798 ****** 13355 1727096178.53774: entering _queue_task() for managed_node3/fail 13355 1727096178.54295: worker is 1 (out of 1 available) 13355 1727096178.54310: exiting _queue_task() for managed_node3/fail 13355 1727096178.54323: done queuing things up, now waiting for results queue to drain 13355 1727096178.54325: waiting for pending results... 
13355 1727096178.54587: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096178.54734: in run() - task 0afff68d-5257-c514-593f-000000000082 13355 1727096178.54790: variable 'ansible_search_path' from source: unknown 13355 1727096178.54793: variable 'ansible_search_path' from source: unknown 13355 1727096178.54807: calling self._execute() 13355 1727096178.54913: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096178.54923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096178.54937: variable 'omit' from source: magic vars 13355 1727096178.55332: variable 'ansible_distribution_major_version' from source: facts 13355 1727096178.55337: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096178.55460: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096178.55766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096178.59760: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096178.59840: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096178.59885: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096178.59928: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096178.60276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096178.60280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096178.60407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.60437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.60481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.60505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.60621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.60646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.60733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.60776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.60831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.60964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096178.60991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096178.61017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096178.61070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096178.61159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096178.61461: variable 'network_connections' from source: task vars 13355 1727096178.61773: variable 'controller_profile' from source: play vars 13355 1727096178.61776: variable 'controller_profile' from source: play vars 13355 1727096178.61837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096178.62240: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096178.62283: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096178.62365: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 
1727096178.62476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13355 1727096178.62524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13355 1727096178.62577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13355 1727096178.62690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096178.62719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13355 1727096178.62974: variable '__network_team_connections_defined' from source: role '' defaults
13355 1727096178.63330: variable 'network_connections' from source: task vars
13355 1727096178.63415: variable 'controller_profile' from source: play vars
13355 1727096178.63479: variable 'controller_profile' from source: play vars
13355 1727096178.63541: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
13355 1727096178.63629: when evaluation is False, skipping this task
13355 1727096178.63639: _execute() done
13355 1727096178.63646: dumping result to json
13355 1727096178.63653: done dumping result, returning
13355 1727096178.63666: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000082]
13355 1727096178.63678: sending task result for task 0afff68d-5257-c514-593f-000000000082
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
13355 1727096178.63848: no more pending results, returning what we have
13355 1727096178.63852: results queue empty
13355 1727096178.63853: checking for any_errors_fatal
13355 1727096178.63860: done checking for any_errors_fatal
13355 1727096178.63860: checking for max_fail_percentage
13355 1727096178.63862: done checking for max_fail_percentage
13355 1727096178.63863: checking to see if all hosts have failed and the running result is not ok
13355 1727096178.63864: done checking to see if all hosts have failed
13355 1727096178.63864: getting the remaining hosts for this loop
13355 1727096178.63866: done getting the remaining hosts for this loop
13355 1727096178.63871: getting the next task for host managed_node3
13355 1727096178.63879: done getting next task for host managed_node3
13355 1727096178.63883: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
13355 1727096178.63886: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096178.63905: getting variables
13355 1727096178.63906: in VariableManager get_vars()
13355 1727096178.63959: Calling all_inventory to load vars for managed_node3
13355 1727096178.63962: Calling groups_inventory to load vars for managed_node3
13355 1727096178.63965: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096178.64378: Calling all_plugins_play to load vars for managed_node3
13355 1727096178.64383: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096178.64386: Calling groups_plugins_play to load vars for managed_node3
13355 1727096178.65179: done sending task result for task 0afff68d-5257-c514-593f-000000000082
13355 1727096178.65183: WORKER PROCESS EXITING
13355 1727096178.67022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096178.70228: done with get_vars()
13355 1727096178.70264: done getting variables
13355 1727096178.70325: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Monday 23 September 2024 08:56:18 -0400 (0:00:00.165) 0:00:27.964 ******
13355 1727096178.70362: entering _queue_task() for managed_node3/package
13355 1727096178.70865: worker is 1 (out of 1 available)
13355 1727096178.70881: exiting _queue_task() for managed_node3/package
13355 1727096178.70896: done queuing things up, now waiting for results queue to drain
13355 1727096178.70897: waiting for pending results...
13355 1727096178.71159: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages
13355 1727096178.71324: in run() - task 0afff68d-5257-c514-593f-000000000083
13355 1727096178.71345: variable 'ansible_search_path' from source: unknown
13355 1727096178.71365: variable 'ansible_search_path' from source: unknown
13355 1727096178.71407: calling self._execute()
13355 1727096178.71517: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096178.71528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096178.71541: variable 'omit' from source: magic vars
13355 1727096178.71944: variable 'ansible_distribution_major_version' from source: facts
13355 1727096178.71963: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096178.72196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13355 1727096178.72486: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13355 1727096178.72534: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13355 1727096178.72584: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13355 1727096178.72662: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13355 1727096178.72780: variable 'network_packages' from source: role '' defaults
13355 1727096178.72917: variable '__network_provider_setup' from source: role '' defaults
13355 1727096178.72941: variable '__network_service_name_default_nm' from source: role '' defaults
13355 1727096178.73014: variable '__network_service_name_default_nm' from source: role '' defaults
13355 1727096178.73099: variable '__network_packages_default_nm' from source: role '' defaults
13355 1727096178.73103: variable '__network_packages_default_nm' from source: role '' defaults
13355 1727096178.73492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13355 1727096178.78363: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13355 1727096178.78633: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13355 1727096178.78671: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13355 1727096178.78926: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13355 1727096178.78929: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13355 1727096178.79009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096178.79038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096178.79074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096178.79172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096178.79177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096178.79179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096178.79391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096178.79415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096178.79454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096178.79469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096178.79894: variable '__network_packages_default_gobject_packages' from source: role '' defaults
13355 1727096178.80204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096178.80226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096178.80250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096178.80372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096178.80377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096178.80588: variable 'ansible_python' from source: facts
13355 1727096178.80616: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
13355 1727096178.80697: variable '__network_wpa_supplicant_required' from source: role '' defaults
13355 1727096178.81072: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
13355 1727096178.81279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096178.81304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096178.81325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096178.81362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096178.81378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096178.81419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096178.81441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096178.81462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096178.81706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096178.81720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096178.81861: variable 'network_connections' from source: task vars
13355 1727096178.81870: variable 'controller_profile' from source: play vars
13355 1727096178.81964: variable 'controller_profile' from source: play vars
13355 1727096178.82235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13355 1727096178.82261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13355 1727096178.82294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096178.82322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13355 1727096178.82368: variable '__network_wireless_connections_defined' from source: role '' defaults
13355 1727096178.83043: variable 'network_connections' from source: task vars
13355 1727096178.83046: variable 'controller_profile' from source: play vars
13355 1727096178.83146: variable 'controller_profile' from source: play vars
13355 1727096178.83379: variable '__network_packages_default_wireless' from source: role '' defaults
13355 1727096178.83459: variable '__network_wireless_connections_defined' from source: role '' defaults
13355 1727096178.83972: variable 'network_connections' from source: task vars
13355 1727096178.84184: variable 'controller_profile' from source: play vars
13355 1727096178.84245: variable 'controller_profile' from source: play vars
13355 1727096178.84267: variable '__network_packages_default_team' from source: role '' defaults
13355 1727096178.84346: variable '__network_team_connections_defined' from source: role '' defaults
13355 1727096178.85040: variable 'network_connections' from source: task vars
13355 1727096178.85043: variable 'controller_profile' from source: play vars
13355 1727096178.85329: variable 'controller_profile' from source: play vars
13355 1727096178.85332: variable '__network_service_name_default_initscripts' from source: role '' defaults
13355 1727096178.85417: variable '__network_service_name_default_initscripts' from source: role '' defaults
13355 1727096178.85423: variable '__network_packages_default_initscripts' from source: role '' defaults
13355 1727096178.85484: variable '__network_packages_default_initscripts' from source: role '' defaults
13355 1727096178.85891: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
13355 1727096178.86976: variable 'network_connections' from source: task vars
13355 1727096178.86982: variable 'controller_profile' from source: play vars
13355 1727096178.87042: variable 'controller_profile' from source: play vars
13355 1727096178.87050: variable 'ansible_distribution' from source: facts
13355 1727096178.87053: variable '__network_rh_distros' from source: role '' defaults
13355 1727096178.87059: variable 'ansible_distribution_major_version' from source: facts
13355 1727096178.87077: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
13355 1727096178.87440: variable 'ansible_distribution' from source: facts
13355 1727096178.87444: variable '__network_rh_distros' from source: role '' defaults
13355 1727096178.87449: variable 'ansible_distribution_major_version' from source: facts
13355 1727096178.87464: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
13355 1727096178.87819: variable 'ansible_distribution' from source: facts
13355 1727096178.87822: variable '__network_rh_distros' from source: role '' defaults
13355 1727096178.87834: variable 'ansible_distribution_major_version' from source: facts
13355 1727096178.87866: variable 'network_provider' from source: set_fact
13355 1727096178.87990: variable 'ansible_facts' from source: unknown
13355 1727096178.88721: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
13355 1727096178.88724: when evaluation is False, skipping this task
13355 1727096178.88727: _execute() done
13355 1727096178.88729: dumping result to json
13355 1727096178.88731: done dumping result, returning
13355 1727096178.88741: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-c514-593f-000000000083]
13355 1727096178.88745: sending task result for task 0afff68d-5257-c514-593f-000000000083
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
13355 1727096178.89018: no more pending results, returning what we have
13355 1727096178.89022: results queue empty
13355 1727096178.89022: checking for any_errors_fatal
13355 1727096178.89029: done checking for any_errors_fatal
13355 1727096178.89030: checking for max_fail_percentage
13355 1727096178.89033: done checking for max_fail_percentage
13355 1727096178.89034: checking to see if all hosts have failed and the running result is not ok
13355 1727096178.89034: done checking to see if all hosts have failed
13355 1727096178.89035: getting the remaining hosts for this loop
13355 1727096178.89036: done getting the remaining hosts for this loop
13355 1727096178.89040: getting the next task for host managed_node3
13355 1727096178.89048: done getting next task for host managed_node3
13355 1727096178.89051: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13355 1727096178.89054: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096178.89074: getting variables
13355 1727096178.89076: in VariableManager get_vars()
13355 1727096178.89130: Calling all_inventory to load vars for managed_node3
13355 1727096178.89134: Calling groups_inventory to load vars for managed_node3
13355 1727096178.89136: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096178.89147: Calling all_plugins_play to load vars for managed_node3
13355 1727096178.89151: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096178.89154: Calling groups_plugins_play to load vars for managed_node3
13355 1727096178.89782: done sending task result for task 0afff68d-5257-c514-593f-000000000083
13355 1727096178.89785: WORKER PROCESS EXITING
13355 1727096178.91634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096178.93333: done with get_vars()
13355 1727096178.93359: done getting variables
13355 1727096178.93415: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Monday 23 September 2024 08:56:18 -0400 (0:00:00.230) 0:00:28.195 ******
13355 1727096178.93449: entering _queue_task() for managed_node3/package
13355 1727096178.93892: worker is 1 (out of 1 available)
13355 1727096178.93904: exiting _queue_task() for managed_node3/package
13355 1727096178.93916: done queuing things up, now waiting for results queue to drain
13355 1727096178.93917: waiting for pending results...
13355 1727096178.94155: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13355 1727096178.94560: in run() - task 0afff68d-5257-c514-593f-000000000084
13355 1727096178.94625: variable 'ansible_search_path' from source: unknown
13355 1727096178.94635: variable 'ansible_search_path' from source: unknown
13355 1727096178.94680: calling self._execute()
13355 1727096178.94955: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096178.94969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096178.94984: variable 'omit' from source: magic vars
13355 1727096178.95735: variable 'ansible_distribution_major_version' from source: facts
13355 1727096178.95788: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096178.96054: variable 'network_state' from source: role '' defaults
13355 1727096178.96073: Evaluated conditional (network_state != {}): False
13355 1727096178.96173: when evaluation is False, skipping this task
13355 1727096178.96177: _execute() done
13355 1727096178.96179: dumping result to json
13355 1727096178.96181: done dumping result, returning
13355 1727096178.96188: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-c514-593f-000000000084]
13355 1727096178.96199: sending task result for task 0afff68d-5257-c514-593f-000000000084
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13355 1727096178.96421: no more pending results, returning what we have
13355 1727096178.96425: results queue empty
13355 1727096178.96426: checking for any_errors_fatal
13355 1727096178.96433: done checking for any_errors_fatal
13355 1727096178.96434: checking for max_fail_percentage
13355 1727096178.96436: done checking for max_fail_percentage
13355 1727096178.96437: checking to see if all hosts have failed and the running result is not ok
13355 1727096178.96437: done checking to see if all hosts have failed
13355 1727096178.96438: getting the remaining hosts for this loop
13355 1727096178.96439: done getting the remaining hosts for this loop
13355 1727096178.96443: getting the next task for host managed_node3
13355 1727096178.96450: done getting next task for host managed_node3
13355 1727096178.96455: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13355 1727096178.96458: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096178.96483: getting variables
13355 1727096178.96485: in VariableManager get_vars()
13355 1727096178.96543: Calling all_inventory to load vars for managed_node3
13355 1727096178.96546: Calling groups_inventory to load vars for managed_node3
13355 1727096178.96548: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096178.96560: Calling all_plugins_play to load vars for managed_node3
13355 1727096178.96564: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096178.96871: Calling groups_plugins_play to load vars for managed_node3
13355 1727096178.97634: done sending task result for task 0afff68d-5257-c514-593f-000000000084
13355 1727096178.97638: WORKER PROCESS EXITING
13355 1727096178.99773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096179.01748: done with get_vars()
13355 1727096179.01783: done getting variables
13355 1727096179.01842: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Monday 23 September 2024 08:56:19 -0400 (0:00:00.084) 0:00:28.279 ******
13355 1727096179.01883: entering _queue_task() for managed_node3/package
13355 1727096179.02233: worker is 1 (out of 1 available)
13355 1727096179.02245: exiting _queue_task() for managed_node3/package
13355 1727096179.02257: done queuing things up, now waiting for results queue to drain
13355 1727096179.02258: waiting for pending results...
13355 1727096179.02560: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13355 1727096179.02713: in run() - task 0afff68d-5257-c514-593f-000000000085
13355 1727096179.02732: variable 'ansible_search_path' from source: unknown
13355 1727096179.02740: variable 'ansible_search_path' from source: unknown
13355 1727096179.02782: calling self._execute()
13355 1727096179.02881: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096179.02893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096179.02909: variable 'omit' from source: magic vars
13355 1727096179.03283: variable 'ansible_distribution_major_version' from source: facts
13355 1727096179.03299: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096179.03423: variable 'network_state' from source: role '' defaults
13355 1727096179.03441: Evaluated conditional (network_state != {}): False
13355 1727096179.03449: when evaluation is False, skipping this task
13355 1727096179.03457: _execute() done
13355 1727096179.03464: dumping result to json
13355 1727096179.03473: done dumping result, returning
13355 1727096179.03484: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-c514-593f-000000000085]
13355 1727096179.03493: sending task result for task 0afff68d-5257-c514-593f-000000000085
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13355 1727096179.03816: no more pending results, returning what we have
13355 1727096179.03820: results queue empty
13355 1727096179.03821: checking for any_errors_fatal
13355 1727096179.03826: done checking for any_errors_fatal
13355 1727096179.03827: checking for max_fail_percentage
13355 1727096179.03829: done checking for max_fail_percentage
13355 1727096179.03830: checking to see if all hosts have failed and the running result is not ok
13355 1727096179.03830: done checking to see if all hosts have failed
13355 1727096179.03832: getting the remaining hosts for this loop
13355 1727096179.03833: done getting the remaining hosts for this loop
13355 1727096179.03836: getting the next task for host managed_node3
13355 1727096179.03844: done getting next task for host managed_node3
13355 1727096179.03848: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13355 1727096179.03851: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096179.03873: getting variables
13355 1727096179.03875: in VariableManager get_vars()
13355 1727096179.03929: Calling all_inventory to load vars for managed_node3
13355 1727096179.03932: Calling groups_inventory to load vars for managed_node3
13355 1727096179.03934: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096179.03945: Calling all_plugins_play to load vars for managed_node3
13355 1727096179.03948: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096179.03951: Calling groups_plugins_play to load vars for managed_node3
13355 1727096179.04481: done sending task result for task 0afff68d-5257-c514-593f-000000000085
13355 1727096179.04484: WORKER PROCESS EXITING
13355 1727096179.06155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096179.07636: done with get_vars()
13355 1727096179.07669: done getting variables
13355 1727096179.07725: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Monday 23 September 2024 08:56:19 -0400 (0:00:00.058) 0:00:28.338 ******
13355 1727096179.07756: entering _queue_task() for managed_node3/service
13355 1727096179.08110: worker is 1 (out of 1 available)
13355 1727096179.08123: exiting _queue_task() for managed_node3/service
13355 1727096179.08136: done queuing things up, now waiting for results queue to drain
13355 1727096179.08137: waiting for pending results...
13355 1727096179.08424: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13355 1727096179.08571: in run() - task 0afff68d-5257-c514-593f-000000000086
13355 1727096179.08595: variable 'ansible_search_path' from source: unknown
13355 1727096179.08603: variable 'ansible_search_path' from source: unknown
13355 1727096179.08644: calling self._execute()
13355 1727096179.08744: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096179.08755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096179.08770: variable 'omit' from source: magic vars
13355 1727096179.09156: variable 'ansible_distribution_major_version' from source: facts
13355 1727096179.09177: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096179.09300: variable '__network_wireless_connections_defined' from source: role '' defaults
13355 1727096179.09503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13355 1727096179.11687: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13355 1727096179.11761: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13355 1727096179.11807: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13355 1727096179.11848: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13355 1727096179.11885: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13355 1727096179.11969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096179.12025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096179.12057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096179.12109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096179.12129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096179.12184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096179.12219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096179.12250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096179.12296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096179.12321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096179.12365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096179.12424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096179.12427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096179.12471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096179.12491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096179.12678: variable 'network_connections' from source: task vars
13355 1727096179.12750: variable 'controller_profile' from source: play vars
13355 1727096179.12776: variable 'controller_profile' from source: play vars
13355 1727096179.12851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13355 1727096179.13028: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13355 1727096179.13075: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13355 1727096179.13110: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13355 1727096179.13143: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13355 1727096179.13194: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13355 1727096179.13221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13355 1727096179.13291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096179.13294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13355 1727096179.13333: variable '__network_team_connections_defined' from source: role '' defaults
13355 1727096179.13579: variable 'network_connections' from source: task vars
13355 1727096179.13590: variable 'controller_profile' from source: play vars
13355 1727096179.13664: variable 'controller_profile' from source: play vars
13355 1727096179.13696: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
13355 1727096179.13705: when evaluation is False, skipping this task
13355 1727096179.13724: _execute() done
13355 1727096179.13727: dumping result to json
13355 1727096179.13730: done dumping result, returning
13355 1727096179.13772: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000086]
13355 1727096179.13775: sending task result for task 0afff68d-5257-c514-593f-000000000086
13355 1727096179.14096: done sending task
result for task 0afff68d-5257-c514-593f-000000000086 13355 1727096179.14106: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096179.14145: no more pending results, returning what we have 13355 1727096179.14148: results queue empty 13355 1727096179.14149: checking for any_errors_fatal 13355 1727096179.14153: done checking for any_errors_fatal 13355 1727096179.14154: checking for max_fail_percentage 13355 1727096179.14156: done checking for max_fail_percentage 13355 1727096179.14157: checking to see if all hosts have failed and the running result is not ok 13355 1727096179.14157: done checking to see if all hosts have failed 13355 1727096179.14158: getting the remaining hosts for this loop 13355 1727096179.14159: done getting the remaining hosts for this loop 13355 1727096179.14162: getting the next task for host managed_node3 13355 1727096179.14169: done getting next task for host managed_node3 13355 1727096179.14173: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096179.14176: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096179.14192: getting variables 13355 1727096179.14194: in VariableManager get_vars() 13355 1727096179.14244: Calling all_inventory to load vars for managed_node3 13355 1727096179.14247: Calling groups_inventory to load vars for managed_node3 13355 1727096179.14249: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096179.14258: Calling all_plugins_play to load vars for managed_node3 13355 1727096179.14262: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096179.14265: Calling groups_plugins_play to load vars for managed_node3 13355 1727096179.15647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096179.17372: done with get_vars() 13355 1727096179.17398: done getting variables 13355 1727096179.17461: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:56:19 -0400 (0:00:00.097) 0:00:28.435 ****** 13355 1727096179.17493: entering _queue_task() for managed_node3/service 13355 1727096179.17862: worker is 1 (out of 1 available) 13355 1727096179.17878: exiting _queue_task() for managed_node3/service 13355 1727096179.17899: done queuing things up, now waiting for results queue to drain 13355 1727096179.17900: waiting for pending results... 
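The trace above shows the core skip mechanism: the task's `when` conditional (`__network_wireless_connections_defined or __network_team_connections_defined`) is templated against the gathered variables, evaluates to `False`, and the executor returns a "skipped" result without ever running the `service` action. A minimal, hypothetical sketch of that flow (illustrative names only, not Ansible's actual internals — Ansible templates conditionals through Jinja2, which plain `eval` stands in for here):

```python
# Hypothetical sketch of Ansible's conditional-skip flow, mirroring the
# "skipping: [managed_node3]" result in the log. Not the real TaskExecutor API.

def evaluate_when(conditional: str, variables: dict) -> bool:
    """Evaluate a when-style expression against task variables.

    Ansible renders the expression with Jinja2; eval() is a stand-in.
    """
    return bool(eval(conditional, {}, variables))

def run_task(action, conditional: str, variables: dict) -> dict:
    """Run the action only if the conditional holds; otherwise skip."""
    if not evaluate_when(conditional, variables):
        # Shape matches the skipped-task result dumped in the log above.
        return {
            "changed": False,
            "false_condition": conditional,
            "skip_reason": "Conditional result was False",
        }
    return action(variables)

# The "Restart NetworkManager" task: neither wireless nor team connections
# are defined, so the conditional is False and the task is skipped.
result = run_task(
    action=lambda v: {"changed": True},
    conditional="__network_wireless_connections_defined"
                " or __network_team_connections_defined",
    variables={
        "__network_wireless_connections_defined": False,
        "__network_team_connections_defined": False,
    },
)
print(result["skip_reason"])  # Conditional result was False
```

Note how this matches the log's ordering: the distribution-version guard (`ansible_distribution_major_version != '6'`) evaluated `True` first, and only the wireless/team conditional short-circuited the task.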
13355 1727096179.18170: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096179.18312: in run() - task 0afff68d-5257-c514-593f-000000000087 13355 1727096179.18336: variable 'ansible_search_path' from source: unknown 13355 1727096179.18345: variable 'ansible_search_path' from source: unknown 13355 1727096179.18391: calling self._execute() 13355 1727096179.18495: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096179.18508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096179.18526: variable 'omit' from source: magic vars 13355 1727096179.18927: variable 'ansible_distribution_major_version' from source: facts 13355 1727096179.18951: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096179.19130: variable 'network_provider' from source: set_fact 13355 1727096179.19140: variable 'network_state' from source: role '' defaults 13355 1727096179.19157: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13355 1727096179.19175: variable 'omit' from source: magic vars 13355 1727096179.19233: variable 'omit' from source: magic vars 13355 1727096179.19275: variable 'network_service_name' from source: role '' defaults 13355 1727096179.19343: variable 'network_service_name' from source: role '' defaults 13355 1727096179.19453: variable '__network_provider_setup' from source: role '' defaults 13355 1727096179.19492: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096179.19534: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096179.19548: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096179.19618: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096179.20074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13355 1727096179.21686: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096179.21734: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096179.21762: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096179.21801: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096179.21823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096179.21885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096179.21906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.21926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.21954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.21966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.22001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096179.22017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.22036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.22062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.22075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.22224: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096179.22308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096179.22324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.22340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.22370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.22381: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.22443: variable 'ansible_python' from source: facts 13355 1727096179.22463: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096179.22523: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096179.22579: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096179.22662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096179.22682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.22699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.22750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.22753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.22801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096179.22842: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.22845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.22936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.22940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.23123: variable 'network_connections' from source: task vars 13355 1727096179.23126: variable 'controller_profile' from source: play vars 13355 1727096179.23129: variable 'controller_profile' from source: play vars 13355 1727096179.23372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096179.23379: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096179.23427: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096179.23470: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096179.23522: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096179.23584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096179.23612: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096179.23642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.23676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096179.23722: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096179.23994: variable 'network_connections' from source: task vars 13355 1727096179.24050: variable 'controller_profile' from source: play vars 13355 1727096179.24105: variable 'controller_profile' from source: play vars 13355 1727096179.24108: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096179.24186: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096179.24399: variable 'network_connections' from source: task vars 13355 1727096179.24402: variable 'controller_profile' from source: play vars 13355 1727096179.24438: variable 'controller_profile' from source: play vars 13355 1727096179.24456: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096179.24523: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096179.24711: variable 'network_connections' from source: task vars 13355 1727096179.24714: variable 'controller_profile' from source: play vars 13355 1727096179.24763: variable 'controller_profile' from source: play vars 13355 1727096179.24802: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096179.24844: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 13355 1727096179.24850: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096179.24894: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096179.25029: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096179.25333: variable 'network_connections' from source: task vars 13355 1727096179.25339: variable 'controller_profile' from source: play vars 13355 1727096179.25383: variable 'controller_profile' from source: play vars 13355 1727096179.25390: variable 'ansible_distribution' from source: facts 13355 1727096179.25393: variable '__network_rh_distros' from source: role '' defaults 13355 1727096179.25399: variable 'ansible_distribution_major_version' from source: facts 13355 1727096179.25410: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096179.25525: variable 'ansible_distribution' from source: facts 13355 1727096179.25528: variable '__network_rh_distros' from source: role '' defaults 13355 1727096179.25533: variable 'ansible_distribution_major_version' from source: facts 13355 1727096179.25544: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13355 1727096179.25658: variable 'ansible_distribution' from source: facts 13355 1727096179.25662: variable '__network_rh_distros' from source: role '' defaults 13355 1727096179.25664: variable 'ansible_distribution_major_version' from source: facts 13355 1727096179.25696: variable 'network_provider' from source: set_fact 13355 1727096179.25712: variable 'omit' from source: magic vars 13355 1727096179.25736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096179.25760: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096179.25774: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096179.25791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096179.25799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096179.25822: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096179.25825: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096179.25827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096179.25900: Set connection var ansible_shell_executable to /bin/sh 13355 1727096179.25907: Set connection var ansible_shell_type to sh 13355 1727096179.25914: Set connection var ansible_pipelining to False 13355 1727096179.25917: Set connection var ansible_connection to ssh 13355 1727096179.25923: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096179.25928: Set connection var ansible_timeout to 10 13355 1727096179.25948: variable 'ansible_shell_executable' from source: unknown 13355 1727096179.25951: variable 'ansible_connection' from source: unknown 13355 1727096179.25954: variable 'ansible_module_compression' from source: unknown 13355 1727096179.25959: variable 'ansible_shell_type' from source: unknown 13355 1727096179.25961: variable 'ansible_shell_executable' from source: unknown 13355 1727096179.25963: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096179.25965: variable 'ansible_pipelining' from source: unknown 13355 1727096179.25969: variable 'ansible_timeout' from source: unknown 13355 1727096179.25971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096179.26046: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096179.26054: variable 'omit' from source: magic vars 13355 1727096179.26060: starting attempt loop 13355 1727096179.26063: running the handler 13355 1727096179.26120: variable 'ansible_facts' from source: unknown 13355 1727096179.26840: _low_level_execute_command(): starting 13355 1727096179.26873: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096179.27498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096179.27515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096179.27529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096179.27548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096179.27643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096179.27666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096179.27688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 13355 1727096179.27747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096179.29454: stdout chunk (state=3): >>>/root <<< 13355 1727096179.29583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096179.29598: stderr chunk (state=3): >>><<< 13355 1727096179.29601: stdout chunk (state=3): >>><<< 13355 1727096179.29642: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096179.29646: _low_level_execute_command(): starting 13355 1727096179.29664: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162 `" && echo 
ansible-tmp-1727096179.2962942-14638-65823296460162="` echo /root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162 `" ) && sleep 0' 13355 1727096179.30298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096179.30309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096179.30314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096179.30318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096179.30320: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096179.30370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096179.30374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096179.30376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096179.30424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096179.32417: stdout chunk (state=3): >>>ansible-tmp-1727096179.2962942-14638-65823296460162=/root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162 <<< 13355 1727096179.32521: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 13355 1727096179.32550: stderr chunk (state=3): >>><<< 13355 1727096179.32553: stdout chunk (state=3): >>><<< 13355 1727096179.32572: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096179.2962942-14638-65823296460162=/root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096179.32605: variable 'ansible_module_compression' from source: unknown 13355 1727096179.32646: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13355 1727096179.32699: variable 'ansible_facts' from source: unknown 13355 1727096179.32837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/AnsiballZ_systemd.py 13355 
1727096179.32942: Sending initial data 13355 1727096179.32946: Sent initial data (155 bytes) 13355 1727096179.33630: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096179.33658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096179.33733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096179.35361: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" 
revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096179.35389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096179.35422: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpqrq01edh /root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/AnsiballZ_systemd.py <<< 13355 1727096179.35435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/AnsiballZ_systemd.py" <<< 13355 1727096179.35459: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpqrq01edh" to remote "/root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/AnsiballZ_systemd.py" <<< 13355 1727096179.36460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096179.36501: stderr chunk (state=3): >>><<< 13355 1727096179.36504: stdout chunk (state=3): >>><<< 13355 1727096179.36546: done transferring module to remote 13355 1727096179.36558: _low_level_execute_command(): starting 13355 1727096179.36561: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/ /root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/AnsiballZ_systemd.py && sleep 0' 13355 1727096179.37020: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096179.37024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096179.37027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096179.37029: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096179.37032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096179.37086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096179.37089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096179.37126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096179.38969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096179.38993: stderr chunk (state=3): >>><<< 13355 1727096179.38996: stdout chunk (state=3): >>><<< 13355 1727096179.39014: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096179.39017: _low_level_execute_command(): starting 13355 1727096179.39021: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/AnsiballZ_systemd.py && sleep 0' 13355 1727096179.39478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096179.39482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096179.39484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096179.39486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096179.39533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096179.39536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096179.39538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096179.39587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096179.69703: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10506240", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297968128", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "987857000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": 
"0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 13355 1727096179.69719: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": 
"no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd<<< 13355 1727096179.69727: stdout chunk (state=3): >>>-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", 
"InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13355 1727096179.71881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096179.71906: stderr chunk (state=3): >>><<< 13355 1727096179.71909: stdout chunk (state=3): >>><<< 13355 1727096179.71928: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10506240", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297968128", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "987857000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096179.72053: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096179.72071: _low_level_execute_command(): starting 13355 1727096179.72077: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096179.2962942-14638-65823296460162/ > /dev/null 2>&1 && sleep 0' 13355 1727096179.72548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096179.72551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096179.72554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096179.72556: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096179.72613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096179.72616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096179.72623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096179.72657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096179.74532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096179.74558: stderr chunk (state=3): >>><<< 13355 1727096179.74563: stdout chunk (state=3): >>><<< 13355 1727096179.74582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096179.74589: handler run complete 13355 1727096179.74629: attempt loop complete, returning result 13355 1727096179.74633: _execute() done 13355 1727096179.74635: dumping result to json 13355 1727096179.74648: done dumping result, returning 13355 1727096179.74658: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-c514-593f-000000000087] 13355 1727096179.74664: sending task result for task 0afff68d-5257-c514-593f-000000000087 13355 1727096179.74866: done sending task result for task 0afff68d-5257-c514-593f-000000000087 13355 1727096179.74871: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096179.74930: no more pending results, returning what we have 13355 1727096179.74933: results queue empty 13355 1727096179.74934: checking for any_errors_fatal 13355 1727096179.74940: done checking for any_errors_fatal 13355 1727096179.74940: checking for max_fail_percentage 13355 1727096179.74942: done checking for max_fail_percentage 13355 1727096179.74943: checking to see if all hosts have failed and the running result is not ok 13355 1727096179.74943: done checking to see if all hosts have failed 13355 1727096179.74944: getting the remaining hosts for this loop 13355 1727096179.74945: done getting the remaining hosts for this loop 13355 1727096179.74948: getting the next task for host managed_node3 13355 1727096179.74954: done getting next task for host managed_node3 13355 1727096179.74958: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096179.74960: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096179.74973: getting variables 13355 1727096179.74975: in VariableManager get_vars() 13355 1727096179.75058: Calling all_inventory to load vars for managed_node3 13355 1727096179.75060: Calling groups_inventory to load vars for managed_node3 13355 1727096179.75063: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096179.75073: Calling all_plugins_play to load vars for managed_node3 13355 1727096179.75076: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096179.75078: Calling groups_plugins_play to load vars for managed_node3 13355 1727096179.75844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096179.76708: done with get_vars() 13355 1727096179.76736: done getting variables 13355 1727096179.76786: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:56:19 -0400 (0:00:00.593) 0:00:29.028 ****** 13355 1727096179.76813: entering _queue_task() for managed_node3/service 13355 1727096179.77084: worker is 1 (out 
of 1 available) 13355 1727096179.77099: exiting _queue_task() for managed_node3/service 13355 1727096179.77112: done queuing things up, now waiting for results queue to drain 13355 1727096179.77114: waiting for pending results... 13355 1727096179.77301: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096179.77397: in run() - task 0afff68d-5257-c514-593f-000000000088 13355 1727096179.77409: variable 'ansible_search_path' from source: unknown 13355 1727096179.77413: variable 'ansible_search_path' from source: unknown 13355 1727096179.77445: calling self._execute() 13355 1727096179.77526: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096179.77530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096179.77539: variable 'omit' from source: magic vars 13355 1727096179.77834: variable 'ansible_distribution_major_version' from source: facts 13355 1727096179.77844: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096179.77933: variable 'network_provider' from source: set_fact 13355 1727096179.77937: Evaluated conditional (network_provider == "nm"): True 13355 1727096179.78004: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096179.78065: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096179.78190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096179.79681: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096179.79725: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096179.79755: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 
1727096179.79785: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096179.79805: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096179.80127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096179.80149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.80171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.80206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.80217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.80252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096179.80274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.80295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.80319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.80330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.80358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096179.80378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.80401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.80422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.80432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.80532: variable 'network_connections' from source: task vars 13355 1727096179.80543: variable 'controller_profile' from source: play vars 13355 1727096179.80597: variable 'controller_profile' from source: play vars 13355 1727096179.80651: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096179.80772: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096179.80800: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096179.80822: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096179.80845: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096179.80881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096179.80897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096179.80913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.80929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096179.80973: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096179.81126: variable 'network_connections' from source: task vars 13355 1727096179.81130: variable 'controller_profile' from source: play vars 13355 1727096179.81180: variable 'controller_profile' from source: play vars 13355 1727096179.81202: Evaluated conditional (__network_wpa_supplicant_required): False 13355 1727096179.81205: when evaluation is False, skipping this task 13355 1727096179.81208: _execute() 
done 13355 1727096179.81210: dumping result to json 13355 1727096179.81213: done dumping result, returning 13355 1727096179.81221: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-c514-593f-000000000088] 13355 1727096179.81233: sending task result for task 0afff68d-5257-c514-593f-000000000088 13355 1727096179.81314: done sending task result for task 0afff68d-5257-c514-593f-000000000088 13355 1727096179.81317: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13355 1727096179.81360: no more pending results, returning what we have 13355 1727096179.81364: results queue empty 13355 1727096179.81364: checking for any_errors_fatal 13355 1727096179.81388: done checking for any_errors_fatal 13355 1727096179.81389: checking for max_fail_percentage 13355 1727096179.81391: done checking for max_fail_percentage 13355 1727096179.81392: checking to see if all hosts have failed and the running result is not ok 13355 1727096179.81392: done checking to see if all hosts have failed 13355 1727096179.81393: getting the remaining hosts for this loop 13355 1727096179.81394: done getting the remaining hosts for this loop 13355 1727096179.81398: getting the next task for host managed_node3 13355 1727096179.81405: done getting next task for host managed_node3 13355 1727096179.81409: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096179.81412: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096179.81428: getting variables 13355 1727096179.81430: in VariableManager get_vars() 13355 1727096179.81490: Calling all_inventory to load vars for managed_node3 13355 1727096179.81492: Calling groups_inventory to load vars for managed_node3 13355 1727096179.81495: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096179.81504: Calling all_plugins_play to load vars for managed_node3 13355 1727096179.81507: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096179.81509: Calling groups_plugins_play to load vars for managed_node3 13355 1727096179.82431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096179.83292: done with get_vars() 13355 1727096179.83317: done getting variables 13355 1727096179.83366: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:56:19 -0400 (0:00:00.065) 0:00:29.094 ****** 13355 1727096179.83393: entering _queue_task() for managed_node3/service 13355 1727096179.83665: worker is 1 (out of 1 available) 13355 1727096179.83682: exiting _queue_task() for managed_node3/service 13355 1727096179.83695: done queuing things up, now waiting for results queue to drain 13355 1727096179.83697: waiting for pending results... 
13355 1727096179.83887: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096179.83983: in run() - task 0afff68d-5257-c514-593f-000000000089 13355 1727096179.83996: variable 'ansible_search_path' from source: unknown 13355 1727096179.83999: variable 'ansible_search_path' from source: unknown 13355 1727096179.84030: calling self._execute() 13355 1727096179.84114: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096179.84117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096179.84127: variable 'omit' from source: magic vars 13355 1727096179.84417: variable 'ansible_distribution_major_version' from source: facts 13355 1727096179.84426: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096179.84510: variable 'network_provider' from source: set_fact 13355 1727096179.84515: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096179.84518: when evaluation is False, skipping this task 13355 1727096179.84520: _execute() done 13355 1727096179.84523: dumping result to json 13355 1727096179.84526: done dumping result, returning 13355 1727096179.84533: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-c514-593f-000000000089] 13355 1727096179.84538: sending task result for task 0afff68d-5257-c514-593f-000000000089 13355 1727096179.84629: done sending task result for task 0afff68d-5257-c514-593f-000000000089 13355 1727096179.84632: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096179.84683: no more pending results, returning what we have 13355 1727096179.84687: results queue empty 13355 1727096179.84688: checking for any_errors_fatal 13355 1727096179.84694: done checking for 
any_errors_fatal 13355 1727096179.84695: checking for max_fail_percentage 13355 1727096179.84697: done checking for max_fail_percentage 13355 1727096179.84698: checking to see if all hosts have failed and the running result is not ok 13355 1727096179.84698: done checking to see if all hosts have failed 13355 1727096179.84699: getting the remaining hosts for this loop 13355 1727096179.84700: done getting the remaining hosts for this loop 13355 1727096179.84703: getting the next task for host managed_node3 13355 1727096179.84709: done getting next task for host managed_node3 13355 1727096179.84713: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13355 1727096179.84716: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096179.84734: getting variables 13355 1727096179.84736: in VariableManager get_vars() 13355 1727096179.84790: Calling all_inventory to load vars for managed_node3 13355 1727096179.84792: Calling groups_inventory to load vars for managed_node3 13355 1727096179.84795: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096179.84804: Calling all_plugins_play to load vars for managed_node3 13355 1727096179.84806: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096179.84809: Calling groups_plugins_play to load vars for managed_node3 13355 1727096179.85611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096179.86481: done with get_vars() 13355 1727096179.86507: done getting variables 13355 1727096179.86552: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:56:19 -0400 (0:00:00.031) 0:00:29.126 ****** 13355 1727096179.86583: entering _queue_task() for managed_node3/copy 13355 1727096179.86863: worker is 1 (out of 1 available) 13355 1727096179.86879: exiting _queue_task() for managed_node3/copy 13355 1727096179.86891: done queuing things up, now waiting for results queue to drain 13355 1727096179.86893: waiting for pending results... 
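Throughout the log, repeated plugin loads report `found_in_cache=True`: once a filter, test, or action module has been resolved from disk, later lookups are served from an in-memory cache. A minimal sketch of that caching behaviour (purely illustrative; Ansible's real `PluginLoader` is far more involved) is:

```python
# Toy version of the "found_in_cache=True" behaviour in the log: resolve a
# plugin once, then hand back the cached object on every later lookup.
class PluginLoader:
    def __init__(self):
        self._cache = {}

    def load(self, name, path):
        if name in self._cache:
            print(f"Loading {name} from {path} (found_in_cache=True)")
            return self._cache[name]
        print(f"Loading {name} from {path}")
        plugin = object()  # stand-in for actually importing the module at `path`
        self._cache[name] = plugin
        return plugin
```

Calling `load("service", ...)` twice returns the same object, which is why the second "Loading ActionModule 'service'" line above is a cache hit rather than a fresh import.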
13355 1727096179.87080: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13355 1727096179.87169: in run() - task 0afff68d-5257-c514-593f-00000000008a 13355 1727096179.87182: variable 'ansible_search_path' from source: unknown 13355 1727096179.87185: variable 'ansible_search_path' from source: unknown 13355 1727096179.87215: calling self._execute() 13355 1727096179.87294: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096179.87299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096179.87307: variable 'omit' from source: magic vars 13355 1727096179.87597: variable 'ansible_distribution_major_version' from source: facts 13355 1727096179.87607: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096179.87692: variable 'network_provider' from source: set_fact 13355 1727096179.87696: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096179.87699: when evaluation is False, skipping this task 13355 1727096179.87701: _execute() done 13355 1727096179.87704: dumping result to json 13355 1727096179.87708: done dumping result, returning 13355 1727096179.87717: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-c514-593f-00000000008a] 13355 1727096179.87722: sending task result for task 0afff68d-5257-c514-593f-00000000008a 13355 1727096179.87814: done sending task result for task 0afff68d-5257-c514-593f-00000000008a 13355 1727096179.87817: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096179.87866: no more pending results, returning what we have 13355 1727096179.87873: results queue empty 13355 1727096179.87873: checking for 
any_errors_fatal 13355 1727096179.87880: done checking for any_errors_fatal 13355 1727096179.87880: checking for max_fail_percentage 13355 1727096179.87882: done checking for max_fail_percentage 13355 1727096179.87883: checking to see if all hosts have failed and the running result is not ok 13355 1727096179.87883: done checking to see if all hosts have failed 13355 1727096179.87884: getting the remaining hosts for this loop 13355 1727096179.87885: done getting the remaining hosts for this loop 13355 1727096179.87889: getting the next task for host managed_node3 13355 1727096179.87896: done getting next task for host managed_node3 13355 1727096179.87901: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13355 1727096179.87904: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096179.87922: getting variables 13355 1727096179.87923: in VariableManager get_vars() 13355 1727096179.87982: Calling all_inventory to load vars for managed_node3 13355 1727096179.87985: Calling groups_inventory to load vars for managed_node3 13355 1727096179.87987: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096179.87996: Calling all_plugins_play to load vars for managed_node3 13355 1727096179.87999: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096179.88001: Calling groups_plugins_play to load vars for managed_node3 13355 1727096179.88924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096179.89785: done with get_vars() 13355 1727096179.89810: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:56:19 -0400 (0:00:00.033) 0:00:29.159 ****** 13355 1727096179.89886: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13355 1727096179.90162: worker is 1 (out of 1 available) 13355 1727096179.90177: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13355 1727096179.90190: done queuing things up, now waiting for results queue to drain 13355 1727096179.90192: waiting for pending results... 
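The `network_connections` task queued above consumes the `network_connections` and `controller_profile` play vars that the variable manager keeps resolving in the log. The log only shows the variable names, not their values; a hypothetical shape (every concrete value below is an assumption for illustration) could look like:

```python
# Hypothetical example of the play vars named in the log; the actual values
# are not shown in this run, so everything concrete here is assumed.
controller_profile = "bond0"  # assumed profile name

network_connections = [
    {
        "name": controller_profile,  # the log shows controller_profile feeding network_connections
        "type": "bond",              # assumed
        "state": "up",               # assumed
    },
]
```

The role iterates this list to build NetworkManager connection profiles, which is why `controller_profile` is resolved each time `network_connections` is.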
13355 1727096179.90378: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13355 1727096179.90478: in run() - task 0afff68d-5257-c514-593f-00000000008b 13355 1727096179.90490: variable 'ansible_search_path' from source: unknown 13355 1727096179.90494: variable 'ansible_search_path' from source: unknown 13355 1727096179.90524: calling self._execute() 13355 1727096179.90604: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096179.90609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096179.90617: variable 'omit' from source: magic vars 13355 1727096179.90907: variable 'ansible_distribution_major_version' from source: facts 13355 1727096179.90917: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096179.90923: variable 'omit' from source: magic vars 13355 1727096179.90963: variable 'omit' from source: magic vars 13355 1727096179.91085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096179.92607: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096179.92654: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096179.92685: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096179.92711: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096179.92734: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096179.92801: variable 'network_provider' from source: set_fact 13355 1727096179.92904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096179.92939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096179.92959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096179.92992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096179.93003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096179.93062: variable 'omit' from source: magic vars 13355 1727096179.93145: variable 'omit' from source: magic vars 13355 1727096179.93222: variable 'network_connections' from source: task vars 13355 1727096179.93232: variable 'controller_profile' from source: play vars 13355 1727096179.93374: variable 'controller_profile' from source: play vars 13355 1727096179.93380: variable 'omit' from source: magic vars 13355 1727096179.93390: variable '__lsr_ansible_managed' from source: task vars 13355 1727096179.93432: variable '__lsr_ansible_managed' from source: task vars 13355 1727096179.93563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13355 1727096179.93718: Loaded config def from plugin (lookup/template) 13355 1727096179.93721: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13355 1727096179.93739: File lookup term: get_ansible_managed.j2 13355 1727096179.93742: 
variable 'ansible_search_path' from source: unknown 13355 1727096179.93745: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13355 1727096179.93758: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13355 1727096179.93775: variable 'ansible_search_path' from source: unknown 13355 1727096180.02981: variable 'ansible_managed' from source: unknown 13355 1727096180.03033: variable 'omit' from source: magic vars 13355 1727096180.03076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096180.03106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096180.03128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096180.03149: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096180.03171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096180.03199: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096180.03208: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.03219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.03321: Set connection var ansible_shell_executable to /bin/sh 13355 1727096180.03334: Set connection var ansible_shell_type to sh 13355 1727096180.03346: Set connection var ansible_pipelining to False 13355 1727096180.03360: Set connection var ansible_connection to ssh 13355 1727096180.03375: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096180.03388: Set connection var ansible_timeout to 10 13355 1727096180.03419: variable 'ansible_shell_executable' from source: unknown 13355 1727096180.03473: variable 'ansible_connection' from source: unknown 13355 1727096180.03477: variable 'ansible_module_compression' from source: unknown 13355 1727096180.03480: variable 'ansible_shell_type' from source: unknown 13355 1727096180.03483: variable 'ansible_shell_executable' from source: unknown 13355 1727096180.03486: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.03489: variable 'ansible_pipelining' from source: unknown 13355 1727096180.03491: variable 'ansible_timeout' from source: unknown 13355 1727096180.03494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.03616: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096180.03642: variable 'omit' from source: magic vars 13355 1727096180.03674: starting attempt loop 13355 1727096180.03677: running the handler 13355 1727096180.03683: _low_level_execute_command(): starting 13355 1727096180.03798: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096180.04471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096180.04489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096180.04511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096180.04529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096180.04608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096180.06313: stdout chunk (state=3): >>>/root <<< 13355 1727096180.06407: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 13355 1727096180.06437: stderr chunk (state=3): >>><<< 13355 1727096180.06440: stdout chunk (state=3): >>><<< 13355 1727096180.06467: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096180.06481: _low_level_execute_command(): starting 13355 1727096180.06487: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588 `" && echo ansible-tmp-1727096180.0646555-14655-139051804405588="` echo /root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588 `" ) && sleep 0' 13355 1727096180.06952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096180.06958: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096180.06960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.06963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096180.06965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.07019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096180.07022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096180.07029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096180.07065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096180.09071: stdout chunk (state=3): >>>ansible-tmp-1727096180.0646555-14655-139051804405588=/root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588 <<< 13355 1727096180.09164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096180.09195: stderr chunk (state=3): >>><<< 13355 1727096180.09198: stdout chunk (state=3): >>><<< 13355 1727096180.09215: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096180.0646555-14655-139051804405588=/root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096180.09254: variable 'ansible_module_compression' from source: unknown 13355 1727096180.09294: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13355 1727096180.09332: variable 'ansible_facts' from source: unknown 13355 1727096180.09427: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/AnsiballZ_network_connections.py 13355 1727096180.09532: Sending initial data 13355 1727096180.09536: Sent initial data (168 bytes) 13355 1727096180.10011: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096180.10014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.10021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096180.10023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096180.10026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.10070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096180.10074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096180.10086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096180.10129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096180.11783: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: 
Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096180.11810: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096180.11850: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpxuv89tfm /root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/AnsiballZ_network_connections.py <<< 13355 1727096180.11853: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/AnsiballZ_network_connections.py" <<< 13355 1727096180.11882: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpxuv89tfm" to remote "/root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/AnsiballZ_network_connections.py" <<< 13355 1727096180.11889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/AnsiballZ_network_connections.py" <<< 13355 1727096180.12569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096180.12610: stderr chunk (state=3): >>><<< 13355 1727096180.12613: stdout chunk (state=3): >>><<< 13355 1727096180.12634: done transferring module to remote 13355 1727096180.12646: _low_level_execute_command(): starting 13355 1727096180.12649: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/ 
/root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/AnsiballZ_network_connections.py && sleep 0' 13355 1727096180.13120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096180.13124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096180.13126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.13128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096180.13130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.13193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096180.13197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096180.13201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096180.13234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096180.15122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096180.15144: stderr chunk (state=3): >>><<< 13355 1727096180.15147: stdout chunk (state=3): >>><<< 13355 1727096180.15164: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096180.15170: _low_level_execute_command(): starting 13355 1727096180.15173: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/AnsiballZ_network_connections.py && sleep 0' 13355 1727096180.15636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096180.15639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.15642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096180.15644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.15701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096180.15713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096180.15717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096180.15752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096180.62362: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7is_hymp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7is_hymp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/0719f6e4-c7c1-4297-9228-210ef46db39c: error=unknown <<< 13355 1727096180.62540: stdout chunk (state=3): >>> 
{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13355 1727096180.64476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096180.64502: stderr chunk (state=3): >>><<< 13355 1727096180.64505: stdout chunk (state=3): >>><<< 13355 1727096180.64524: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7is_hymp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7is_hymp/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/0719f6e4-c7c1-4297-9228-210ef46db39c: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096180.64552: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096180.64565: _low_level_execute_command(): starting 13355 1727096180.64570: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096180.0646555-14655-139051804405588/ > /dev/null 2>&1 && sleep 0' 13355 1727096180.65017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096180.65021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.65034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.65098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096180.65102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096180.65104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096180.65144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096180.67025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096180.67052: stderr chunk (state=3): >>><<< 13355 1727096180.67056: stdout chunk (state=3): >>><<< 13355 1727096180.67075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096180.67086: handler run complete 13355 1727096180.67105: attempt loop complete, returning result 13355 1727096180.67108: _execute() done 13355 1727096180.67110: dumping result to json 13355 1727096180.67114: done dumping result, returning 13355 1727096180.67122: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-c514-593f-00000000008b] 13355 1727096180.67124: sending task result for task 0afff68d-5257-c514-593f-00000000008b 13355 1727096180.67221: done sending task result for task 0afff68d-5257-c514-593f-00000000008b 13355 1727096180.67224: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13355 1727096180.67312: no more pending results, returning what we have 13355 1727096180.67315: results queue empty 13355 1727096180.67316: checking for any_errors_fatal 13355 1727096180.67322: done checking for any_errors_fatal 13355 1727096180.67322: checking for max_fail_percentage 13355 1727096180.67324: done checking for max_fail_percentage 13355 1727096180.67325: checking to see if all hosts have failed and the running result is not ok 13355 1727096180.67326: done checking to see if all hosts have failed 13355 1727096180.67326: getting the remaining hosts for this loop 13355 1727096180.67327: done getting the remaining hosts for this loop 13355 1727096180.67331: getting the next task for host managed_node3 13355 1727096180.67336: done getting next task for 
host managed_node3 13355 1727096180.67340: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096180.67342: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096180.67351: getting variables 13355 1727096180.67352: in VariableManager get_vars() 13355 1727096180.67401: Calling all_inventory to load vars for managed_node3 13355 1727096180.67403: Calling groups_inventory to load vars for managed_node3 13355 1727096180.67405: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096180.67414: Calling all_plugins_play to load vars for managed_node3 13355 1727096180.67417: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096180.67419: Calling groups_plugins_play to load vars for managed_node3 13355 1727096180.68298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096180.69282: done with get_vars() 13355 1727096180.69301: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:56:20 -0400 (0:00:00.794) 0:00:29.954 ****** 13355 1727096180.69372: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13355 1727096180.69646: worker is 1 (out of 1 available) 13355 1727096180.69662: exiting 
_queue_task() for managed_node3/fedora.linux_system_roles.network_state 13355 1727096180.69679: done queuing things up, now waiting for results queue to drain 13355 1727096180.69680: waiting for pending results... 13355 1727096180.69859: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096180.69949: in run() - task 0afff68d-5257-c514-593f-00000000008c 13355 1727096180.69962: variable 'ansible_search_path' from source: unknown 13355 1727096180.69965: variable 'ansible_search_path' from source: unknown 13355 1727096180.69996: calling self._execute() 13355 1727096180.70076: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.70080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.70089: variable 'omit' from source: magic vars 13355 1727096180.70382: variable 'ansible_distribution_major_version' from source: facts 13355 1727096180.70392: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096180.70479: variable 'network_state' from source: role '' defaults 13355 1727096180.70487: Evaluated conditional (network_state != {}): False 13355 1727096180.70491: when evaluation is False, skipping this task 13355 1727096180.70493: _execute() done 13355 1727096180.70496: dumping result to json 13355 1727096180.70498: done dumping result, returning 13355 1727096180.70506: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-c514-593f-00000000008c] 13355 1727096180.70511: sending task result for task 0afff68d-5257-c514-593f-00000000008c 13355 1727096180.70601: done sending task result for task 0afff68d-5257-c514-593f-00000000008c 13355 1727096180.70604: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 
1727096180.70657: no more pending results, returning what we have 13355 1727096180.70661: results queue empty 13355 1727096180.70662: checking for any_errors_fatal 13355 1727096180.70672: done checking for any_errors_fatal 13355 1727096180.70673: checking for max_fail_percentage 13355 1727096180.70675: done checking for max_fail_percentage 13355 1727096180.70676: checking to see if all hosts have failed and the running result is not ok 13355 1727096180.70676: done checking to see if all hosts have failed 13355 1727096180.70677: getting the remaining hosts for this loop 13355 1727096180.70678: done getting the remaining hosts for this loop 13355 1727096180.70682: getting the next task for host managed_node3 13355 1727096180.70690: done getting next task for host managed_node3 13355 1727096180.70694: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096180.70697: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096180.70717: getting variables 13355 1727096180.70718: in VariableManager get_vars() 13355 1727096180.70775: Calling all_inventory to load vars for managed_node3 13355 1727096180.70778: Calling groups_inventory to load vars for managed_node3 13355 1727096180.70780: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096180.70789: Calling all_plugins_play to load vars for managed_node3 13355 1727096180.70791: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096180.70794: Calling groups_plugins_play to load vars for managed_node3 13355 1727096180.71589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096180.72462: done with get_vars() 13355 1727096180.72488: done getting variables 13355 1727096180.72536: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:56:20 -0400 (0:00:00.031) 0:00:29.986 ****** 13355 1727096180.72565: entering _queue_task() for managed_node3/debug 13355 1727096180.72846: worker is 1 (out of 1 available) 13355 1727096180.72863: exiting _queue_task() for managed_node3/debug 13355 1727096180.72878: done queuing things up, now waiting for results queue to drain 13355 1727096180.72879: waiting for pending results... 
13355 1727096180.73065: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096180.73159: in run() - task 0afff68d-5257-c514-593f-00000000008d 13355 1727096180.73171: variable 'ansible_search_path' from source: unknown 13355 1727096180.73176: variable 'ansible_search_path' from source: unknown 13355 1727096180.73206: calling self._execute() 13355 1727096180.73286: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.73291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.73298: variable 'omit' from source: magic vars 13355 1727096180.73586: variable 'ansible_distribution_major_version' from source: facts 13355 1727096180.73596: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096180.73601: variable 'omit' from source: magic vars 13355 1727096180.73651: variable 'omit' from source: magic vars 13355 1727096180.73676: variable 'omit' from source: magic vars 13355 1727096180.73710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096180.73737: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096180.73752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096180.73770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096180.73780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096180.73805: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096180.73808: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.73811: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13355 1727096180.73884: Set connection var ansible_shell_executable to /bin/sh 13355 1727096180.73890: Set connection var ansible_shell_type to sh 13355 1727096180.73895: Set connection var ansible_pipelining to False 13355 1727096180.73900: Set connection var ansible_connection to ssh 13355 1727096180.73905: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096180.73910: Set connection var ansible_timeout to 10 13355 1727096180.73929: variable 'ansible_shell_executable' from source: unknown 13355 1727096180.73932: variable 'ansible_connection' from source: unknown 13355 1727096180.73935: variable 'ansible_module_compression' from source: unknown 13355 1727096180.73937: variable 'ansible_shell_type' from source: unknown 13355 1727096180.73939: variable 'ansible_shell_executable' from source: unknown 13355 1727096180.73941: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.73944: variable 'ansible_pipelining' from source: unknown 13355 1727096180.73947: variable 'ansible_timeout' from source: unknown 13355 1727096180.73951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.74054: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096180.74064: variable 'omit' from source: magic vars 13355 1727096180.74070: starting attempt loop 13355 1727096180.74073: running the handler 13355 1727096180.74169: variable '__network_connections_result' from source: set_fact 13355 1727096180.74211: handler run complete 13355 1727096180.74223: attempt loop complete, returning result 13355 1727096180.74226: _execute() done 13355 1727096180.74229: dumping result to json 13355 1727096180.74232: 
done dumping result, returning 13355 1727096180.74241: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-c514-593f-00000000008d] 13355 1727096180.74245: sending task result for task 0afff68d-5257-c514-593f-00000000008d 13355 1727096180.74330: done sending task result for task 0afff68d-5257-c514-593f-00000000008d 13355 1727096180.74333: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 13355 1727096180.74399: no more pending results, returning what we have 13355 1727096180.74402: results queue empty 13355 1727096180.74403: checking for any_errors_fatal 13355 1727096180.74409: done checking for any_errors_fatal 13355 1727096180.74409: checking for max_fail_percentage 13355 1727096180.74411: done checking for max_fail_percentage 13355 1727096180.74412: checking to see if all hosts have failed and the running result is not ok 13355 1727096180.74412: done checking to see if all hosts have failed 13355 1727096180.74413: getting the remaining hosts for this loop 13355 1727096180.74415: done getting the remaining hosts for this loop 13355 1727096180.74418: getting the next task for host managed_node3 13355 1727096180.74425: done getting next task for host managed_node3 13355 1727096180.74429: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096180.74432: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096180.74442: getting variables 13355 1727096180.74443: in VariableManager get_vars() 13355 1727096180.74497: Calling all_inventory to load vars for managed_node3 13355 1727096180.74500: Calling groups_inventory to load vars for managed_node3 13355 1727096180.74502: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096180.74511: Calling all_plugins_play to load vars for managed_node3 13355 1727096180.74513: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096180.74516: Calling groups_plugins_play to load vars for managed_node3 13355 1727096180.75443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096180.80420: done with get_vars() 13355 1727096180.80445: done getting variables 13355 1727096180.80488: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:56:20 -0400 (0:00:00.079) 0:00:30.065 ****** 13355 1727096180.80511: entering _queue_task() for managed_node3/debug 13355 1727096180.80802: worker is 1 (out of 1 available) 13355 1727096180.80818: exiting _queue_task() for managed_node3/debug 13355 1727096180.80832: done queuing things up, now waiting for results queue to drain 13355 1727096180.80834: waiting for pending results... 
13355 1727096180.81023: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096180.81136: in run() - task 0afff68d-5257-c514-593f-00000000008e 13355 1727096180.81148: variable 'ansible_search_path' from source: unknown 13355 1727096180.81152: variable 'ansible_search_path' from source: unknown 13355 1727096180.81188: calling self._execute() 13355 1727096180.81264: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.81271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.81287: variable 'omit' from source: magic vars 13355 1727096180.81571: variable 'ansible_distribution_major_version' from source: facts 13355 1727096180.81580: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096180.81587: variable 'omit' from source: magic vars 13355 1727096180.81630: variable 'omit' from source: magic vars 13355 1727096180.81660: variable 'omit' from source: magic vars 13355 1727096180.81692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096180.81773: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096180.81777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096180.81780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096180.81783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096180.81789: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096180.81792: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.81796: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13355 1727096180.81875: Set connection var ansible_shell_executable to /bin/sh 13355 1727096180.81881: Set connection var ansible_shell_type to sh 13355 1727096180.81886: Set connection var ansible_pipelining to False 13355 1727096180.81890: Set connection var ansible_connection to ssh 13355 1727096180.81896: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096180.81902: Set connection var ansible_timeout to 10 13355 1727096180.81923: variable 'ansible_shell_executable' from source: unknown 13355 1727096180.81926: variable 'ansible_connection' from source: unknown 13355 1727096180.81931: variable 'ansible_module_compression' from source: unknown 13355 1727096180.81933: variable 'ansible_shell_type' from source: unknown 13355 1727096180.81936: variable 'ansible_shell_executable' from source: unknown 13355 1727096180.81939: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.81941: variable 'ansible_pipelining' from source: unknown 13355 1727096180.81943: variable 'ansible_timeout' from source: unknown 13355 1727096180.81945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.82090: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096180.82094: variable 'omit' from source: magic vars 13355 1727096180.82273: starting attempt loop 13355 1727096180.82276: running the handler 13355 1727096180.82278: variable '__network_connections_result' from source: set_fact 13355 1727096180.82280: variable '__network_connections_result' from source: set_fact 13355 1727096180.82382: handler run complete 13355 1727096180.82409: attempt loop complete, returning result 13355 1727096180.82424: 
_execute() done 13355 1727096180.82433: dumping result to json 13355 1727096180.82441: done dumping result, returning 13355 1727096180.82454: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-c514-593f-00000000008e] 13355 1727096180.82464: sending task result for task 0afff68d-5257-c514-593f-00000000008e 13355 1727096180.82876: done sending task result for task 0afff68d-5257-c514-593f-00000000008e 13355 1727096180.82879: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13355 1727096180.82973: no more pending results, returning what we have 13355 1727096180.82977: results queue empty 13355 1727096180.82978: checking for any_errors_fatal 13355 1727096180.82984: done checking for any_errors_fatal 13355 1727096180.82985: checking for max_fail_percentage 13355 1727096180.82987: done checking for max_fail_percentage 13355 1727096180.82989: checking to see if all hosts have failed and the running result is not ok 13355 1727096180.82990: done checking to see if all hosts have failed 13355 1727096180.82991: getting the remaining hosts for this loop 13355 1727096180.82992: done getting the remaining hosts for this loop 13355 1727096180.82996: getting the next task for host managed_node3 13355 1727096180.83002: done getting next task for host managed_node3 13355 1727096180.83006: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096180.83009: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096180.83021: getting variables 13355 1727096180.83022: in VariableManager get_vars() 13355 1727096180.83080: Calling all_inventory to load vars for managed_node3 13355 1727096180.83083: Calling groups_inventory to load vars for managed_node3 13355 1727096180.83086: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096180.83095: Calling all_plugins_play to load vars for managed_node3 13355 1727096180.83098: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096180.83101: Calling groups_plugins_play to load vars for managed_node3 13355 1727096180.84430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096180.85309: done with get_vars() 13355 1727096180.85331: done getting variables 13355 1727096180.85382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:56:20 -0400 (0:00:00.049) 0:00:30.114 ****** 13355 1727096180.85414: entering _queue_task() for managed_node3/debug 13355 1727096180.85682: 
worker is 1 (out of 1 available) 13355 1727096180.85695: exiting _queue_task() for managed_node3/debug 13355 1727096180.85709: done queuing things up, now waiting for results queue to drain 13355 1727096180.85710: waiting for pending results... 13355 1727096180.85898: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096180.85991: in run() - task 0afff68d-5257-c514-593f-00000000008f 13355 1727096180.86002: variable 'ansible_search_path' from source: unknown 13355 1727096180.86006: variable 'ansible_search_path' from source: unknown 13355 1727096180.86035: calling self._execute() 13355 1727096180.86274: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.86278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.86280: variable 'omit' from source: magic vars 13355 1727096180.86604: variable 'ansible_distribution_major_version' from source: facts 13355 1727096180.86622: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096180.86749: variable 'network_state' from source: role '' defaults 13355 1727096180.86769: Evaluated conditional (network_state != {}): False 13355 1727096180.86777: when evaluation is False, skipping this task 13355 1727096180.86783: _execute() done 13355 1727096180.86790: dumping result to json 13355 1727096180.86796: done dumping result, returning 13355 1727096180.86808: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-c514-593f-00000000008f] 13355 1727096180.86817: sending task result for task 0afff68d-5257-c514-593f-00000000008f skipping: [managed_node3] => { "false_condition": "network_state != {}" } 13355 1727096180.86973: no more pending results, returning what we have 13355 1727096180.86977: results queue empty 13355 1727096180.87070: checking for 
any_errors_fatal 13355 1727096180.87081: done checking for any_errors_fatal 13355 1727096180.87081: checking for max_fail_percentage 13355 1727096180.87083: done checking for max_fail_percentage 13355 1727096180.87084: checking to see if all hosts have failed and the running result is not ok 13355 1727096180.87085: done checking to see if all hosts have failed 13355 1727096180.87087: getting the remaining hosts for this loop 13355 1727096180.87088: done getting the remaining hosts for this loop 13355 1727096180.87091: getting the next task for host managed_node3 13355 1727096180.87099: done getting next task for host managed_node3 13355 1727096180.87103: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096180.87106: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096180.87124: getting variables 13355 1727096180.87126: in VariableManager get_vars() 13355 1727096180.87290: Calling all_inventory to load vars for managed_node3 13355 1727096180.87293: Calling groups_inventory to load vars for managed_node3 13355 1727096180.87296: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096180.87306: Calling all_plugins_play to load vars for managed_node3 13355 1727096180.87309: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096180.87312: Calling groups_plugins_play to load vars for managed_node3 13355 1727096180.87832: done sending task result for task 0afff68d-5257-c514-593f-00000000008f 13355 1727096180.87836: WORKER PROCESS EXITING 13355 1727096180.88995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096180.90537: done with get_vars() 13355 1727096180.90569: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:56:20 -0400 (0:00:00.052) 0:00:30.167 ****** 13355 1727096180.90702: entering _queue_task() for managed_node3/ping 13355 1727096180.91106: worker is 1 (out of 1 available) 13355 1727096180.91118: exiting _queue_task() for managed_node3/ping 13355 1727096180.91131: done queuing things up, now waiting for results queue to drain 13355 1727096180.91132: waiting for pending results... 
13355 1727096180.91585: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096180.91590: in run() - task 0afff68d-5257-c514-593f-000000000090 13355 1727096180.91601: variable 'ansible_search_path' from source: unknown 13355 1727096180.91608: variable 'ansible_search_path' from source: unknown 13355 1727096180.91648: calling self._execute() 13355 1727096180.91758: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.91773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.91788: variable 'omit' from source: magic vars 13355 1727096180.92192: variable 'ansible_distribution_major_version' from source: facts 13355 1727096180.92212: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096180.92224: variable 'omit' from source: magic vars 13355 1727096180.92295: variable 'omit' from source: magic vars 13355 1727096180.92348: variable 'omit' from source: magic vars 13355 1727096180.92458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096180.92508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096180.92534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096180.92574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096180.92577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096180.92614: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096180.92682: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.92685: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13355 1727096180.92744: Set connection var ansible_shell_executable to /bin/sh 13355 1727096180.92755: Set connection var ansible_shell_type to sh 13355 1727096180.92769: Set connection var ansible_pipelining to False 13355 1727096180.92778: Set connection var ansible_connection to ssh 13355 1727096180.92792: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096180.92800: Set connection var ansible_timeout to 10 13355 1727096180.92823: variable 'ansible_shell_executable' from source: unknown 13355 1727096180.92829: variable 'ansible_connection' from source: unknown 13355 1727096180.92834: variable 'ansible_module_compression' from source: unknown 13355 1727096180.92839: variable 'ansible_shell_type' from source: unknown 13355 1727096180.92843: variable 'ansible_shell_executable' from source: unknown 13355 1727096180.92848: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096180.92853: variable 'ansible_pipelining' from source: unknown 13355 1727096180.92860: variable 'ansible_timeout' from source: unknown 13355 1727096180.92866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096180.93114: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096180.93118: variable 'omit' from source: magic vars 13355 1727096180.93120: starting attempt loop 13355 1727096180.93122: running the handler 13355 1727096180.93124: _low_level_execute_command(): starting 13355 1727096180.93128: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096180.94169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096180.94220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.94322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096180.94346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096180.94432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096180.96125: stdout chunk (state=3): >>>/root <<< 13355 1727096180.96306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096180.96310: stderr chunk (state=3): >>><<< 13355 1727096180.96313: stdout chunk (state=3): >>><<< 13355 1727096180.96346: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096180.96359: _low_level_execute_command(): starting 13355 1727096180.96363: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099 `" && echo ansible-tmp-1727096180.9634473-14675-48854776228099="` echo /root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099 `" ) && sleep 0' 13355 1727096180.96834: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096180.96838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.96841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096180.96852: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096180.96854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096180.96933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096180.96970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096180.99091: stdout chunk (state=3): >>>ansible-tmp-1727096180.9634473-14675-48854776228099=/root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099 <<< 13355 1727096180.99372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096180.99376: stdout chunk (state=3): >>><<< 13355 1727096180.99379: stderr chunk (state=3): >>><<< 13355 1727096180.99398: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096180.9634473-14675-48854776228099=/root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096180.99461: variable 'ansible_module_compression' from source: unknown 13355 1727096180.99542: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13355 1727096180.99565: variable 'ansible_facts' from source: unknown 13355 1727096180.99651: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/AnsiballZ_ping.py 13355 1727096180.99916: Sending initial data 13355 1727096180.99928: Sent initial data (152 bytes) 13355 1727096181.00520: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096181.00533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.00587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.00648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.00673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.00706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.00773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.02572: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096181.02578: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096181.02580: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmptpcgn2q4" to remote "/root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/AnsiballZ_ping.py" <<< 13355 1727096181.02799: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmptpcgn2q4 /root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/AnsiballZ_ping.py <<< 13355 1727096181.03471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.03534: stderr chunk (state=3): >>><<< 13355 1727096181.03552: stdout chunk (state=3): >>><<< 13355 1727096181.03628: done transferring module to remote 13355 1727096181.03645: _low_level_execute_command(): starting 13355 1727096181.03671: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/ /root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/AnsiballZ_ping.py && sleep 0' 13355 1727096181.04344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096181.04364: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096181.04390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096181.04409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096181.04434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096181.04543: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.04576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.04654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.06607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.06871: stderr chunk (state=3): >>><<< 13355 1727096181.06876: stdout chunk (state=3): >>><<< 13355 1727096181.06879: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.06881: _low_level_execute_command(): starting 13355 1727096181.06884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/AnsiballZ_ping.py && sleep 0' 13355 1727096181.08610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096181.08615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.08636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.08731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.08750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 
1727096181.08911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.24677: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13355 1727096181.26098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096181.26126: stderr chunk (state=3): >>><<< 13355 1727096181.26130: stdout chunk (state=3): >>><<< 13355 1727096181.26149: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096181.26174: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096181.26182: _low_level_execute_command(): starting 13355 1727096181.26187: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096180.9634473-14675-48854776228099/ > /dev/null 2>&1 && sleep 0' 13355 1727096181.26650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096181.26654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.26656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096181.26658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found <<< 13355 1727096181.26661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.26720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.26728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.26730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.26765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.28651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.28682: stderr chunk (state=3): >>><<< 13355 1727096181.28685: stdout chunk (state=3): >>><<< 13355 1727096181.28699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.28707: handler run complete 13355 1727096181.28722: attempt loop complete, returning result 13355 1727096181.28724: _execute() done 13355 1727096181.28727: dumping result to json 13355 1727096181.28729: done dumping result, returning 13355 1727096181.28736: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-c514-593f-000000000090] 13355 1727096181.28741: sending task result for task 0afff68d-5257-c514-593f-000000000090 13355 1727096181.28832: done sending task result for task 0afff68d-5257-c514-593f-000000000090 13355 1727096181.28835: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 13355 1727096181.28899: no more pending results, returning what we have 13355 1727096181.28903: results queue empty 13355 1727096181.28903: checking for any_errors_fatal 13355 1727096181.28909: done checking for any_errors_fatal 13355 1727096181.28910: checking for max_fail_percentage 13355 1727096181.28912: done checking for max_fail_percentage 13355 1727096181.28912: checking to see if all hosts have failed and the running result is not ok 13355 1727096181.28913: done checking to see if all hosts have failed 13355 1727096181.28914: getting the remaining hosts for this loop 13355 1727096181.28915: done getting the remaining hosts for this loop 13355 1727096181.28918: getting the next task for host managed_node3 13355 1727096181.28927: done getting next task for host managed_node3 13355 1727096181.28929: ^ task is: TASK: meta (role_complete) 13355 1727096181.28932: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096181.28942: getting variables 13355 1727096181.28945: in VariableManager get_vars() 13355 1727096181.28999: Calling all_inventory to load vars for managed_node3 13355 1727096181.29001: Calling groups_inventory to load vars for managed_node3 13355 1727096181.29004: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096181.29014: Calling all_plugins_play to load vars for managed_node3 13355 1727096181.29017: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096181.29019: Calling groups_plugins_play to load vars for managed_node3 13355 1727096181.29810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096181.30688: done with get_vars() 13355 1727096181.30711: done getting variables 13355 1727096181.30775: done queuing things up, now waiting for results queue to drain 13355 1727096181.30777: results queue empty 13355 1727096181.30778: checking for any_errors_fatal 13355 1727096181.30780: done checking for any_errors_fatal 13355 1727096181.30780: checking for max_fail_percentage 13355 1727096181.30781: done checking for max_fail_percentage 13355 1727096181.30781: checking to see if all hosts have failed and the running result is not ok 13355 1727096181.30782: done checking to see if all hosts have failed 13355 1727096181.30782: getting the remaining hosts for this loop 13355 1727096181.30783: done getting the remaining hosts for this loop 13355 1727096181.30785: getting the next task for host managed_node3 13355 1727096181.30788: done getting next task for host 
managed_node3 13355 1727096181.30790: ^ task is: TASK: From the active connection, get the port1 profile "{{ port1_profile }}" 13355 1727096181.30791: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096181.30792: getting variables 13355 1727096181.30793: in VariableManager get_vars() 13355 1727096181.30809: Calling all_inventory to load vars for managed_node3 13355 1727096181.30810: Calling groups_inventory to load vars for managed_node3 13355 1727096181.30812: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096181.30816: Calling all_plugins_play to load vars for managed_node3 13355 1727096181.30818: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096181.30819: Calling groups_plugins_play to load vars for managed_node3 13355 1727096181.31535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096181.32391: done with get_vars() 13355 1727096181.32410: done getting variables 13355 1727096181.32445: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096181.32538: variable 'port1_profile' from source: play vars TASK [From the active connection, get the port1 profile "bond0.0"] ************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:104 Monday 23 September 2024 08:56:21 -0400 (0:00:00.418) 0:00:30.586 ****** 13355 1727096181.32561: entering _queue_task() for 
managed_node3/command 13355 1727096181.32877: worker is 1 (out of 1 available) 13355 1727096181.32893: exiting _queue_task() for managed_node3/command 13355 1727096181.32909: done queuing things up, now waiting for results queue to drain 13355 1727096181.32910: waiting for pending results... 13355 1727096181.33102: running TaskExecutor() for managed_node3/TASK: From the active connection, get the port1 profile "bond0.0" 13355 1727096181.33183: in run() - task 0afff68d-5257-c514-593f-0000000000c0 13355 1727096181.33196: variable 'ansible_search_path' from source: unknown 13355 1727096181.33230: calling self._execute() 13355 1727096181.33311: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096181.33315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096181.33329: variable 'omit' from source: magic vars 13355 1727096181.33617: variable 'ansible_distribution_major_version' from source: facts 13355 1727096181.33626: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096181.33710: variable 'network_provider' from source: set_fact 13355 1727096181.33714: Evaluated conditional (network_provider == "nm"): True 13355 1727096181.33721: variable 'omit' from source: magic vars 13355 1727096181.33739: variable 'omit' from source: magic vars 13355 1727096181.33811: variable 'port1_profile' from source: play vars 13355 1727096181.33826: variable 'omit' from source: magic vars 13355 1727096181.33865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096181.33898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096181.33913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096181.33926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 13355 1727096181.33936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096181.33963: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096181.33967: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096181.33971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096181.34042: Set connection var ansible_shell_executable to /bin/sh 13355 1727096181.34047: Set connection var ansible_shell_type to sh 13355 1727096181.34052: Set connection var ansible_pipelining to False 13355 1727096181.34057: Set connection var ansible_connection to ssh 13355 1727096181.34065: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096181.34070: Set connection var ansible_timeout to 10 13355 1727096181.34093: variable 'ansible_shell_executable' from source: unknown 13355 1727096181.34096: variable 'ansible_connection' from source: unknown 13355 1727096181.34099: variable 'ansible_module_compression' from source: unknown 13355 1727096181.34101: variable 'ansible_shell_type' from source: unknown 13355 1727096181.34103: variable 'ansible_shell_executable' from source: unknown 13355 1727096181.34105: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096181.34109: variable 'ansible_pipelining' from source: unknown 13355 1727096181.34112: variable 'ansible_timeout' from source: unknown 13355 1727096181.34115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096181.34220: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096181.34232: 
variable 'omit' from source: magic vars 13355 1727096181.34237: starting attempt loop 13355 1727096181.34240: running the handler 13355 1727096181.34253: _low_level_execute_command(): starting 13355 1727096181.34263: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096181.34785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096181.34790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.34842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.34845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.34848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.34894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.36581: stdout chunk (state=3): >>>/root <<< 13355 1727096181.36680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.36709: stderr chunk (state=3): >>><<< 13355 1727096181.36713: stdout chunk 
(state=3): >>><<< 13355 1727096181.36737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.36750: _low_level_execute_command(): starting 13355 1727096181.36758: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407 `" && echo ansible-tmp-1727096181.3673596-14705-209209552472407="` echo /root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407 `" ) && sleep 0' 13355 1727096181.37230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096181.37234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.37245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096181.37248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.37298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.37301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.37308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.37343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.39332: stdout chunk (state=3): >>>ansible-tmp-1727096181.3673596-14705-209209552472407=/root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407 <<< 13355 1727096181.39432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.39463: stderr chunk (state=3): >>><<< 13355 1727096181.39466: stdout chunk (state=3): >>><<< 13355 1727096181.39486: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096181.3673596-14705-209209552472407=/root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.39516: variable 'ansible_module_compression' from source: unknown 13355 1727096181.39558: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096181.39595: variable 'ansible_facts' from source: unknown 13355 1727096181.39647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/AnsiballZ_command.py 13355 1727096181.39759: Sending initial data 13355 1727096181.39763: Sent initial data (156 bytes) 13355 1727096181.40202: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096181.40206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.40218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.40264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.40291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.40327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.41980: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096181.42008: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096181.42037: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpe8rxxa1d /root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/AnsiballZ_command.py <<< 13355 1727096181.42041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/AnsiballZ_command.py" <<< 13355 1727096181.42073: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpe8rxxa1d" to remote "/root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/AnsiballZ_command.py" <<< 13355 1727096181.42076: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/AnsiballZ_command.py" <<< 13355 1727096181.42561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.42604: stderr chunk (state=3): >>><<< 13355 1727096181.42608: stdout chunk (state=3): >>><<< 13355 1727096181.42650: done transferring module to remote 13355 1727096181.42665: _low_level_execute_command(): starting 13355 1727096181.42671: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/ /root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/AnsiballZ_command.py && sleep 0' 13355 1727096181.43135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096181.43139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 
1727096181.43142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096181.43144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.43198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.43201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.43203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.43244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.45132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.45137: stdout chunk (state=3): >>><<< 13355 1727096181.45140: stderr chunk (state=3): >>><<< 13355 1727096181.45158: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.45162: _low_level_execute_command(): starting 13355 1727096181.45171: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/AnsiballZ_command.py && sleep 0' 13355 1727096181.45641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096181.45645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.45647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096181.45650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.45704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.45710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.45713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.45754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.63453: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-23 08:56:21.614174", "end": "2024-09-23 08:56:21.631330", "delta": "0:00:00.017156", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096181.65282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096181.65286: stdout chunk (state=3): >>><<< 13355 1727096181.65289: stderr chunk (state=3): >>><<< 13355 1727096181.65292: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.0"], "start": "2024-09-23 08:56:21.614174", "end": "2024-09-23 08:56:21.631330", "delta": "0:00:00.017156", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
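The trace up to this point is one full pass of Ansible's per-task remote execution cycle, as seen in the `_low_level_execute_command()` calls: create a temp directory under `~/.ansible/tmp`, transfer the `AnsiballZ_command.py` payload over the multiplexed SSH session via sftp, `chmod u+x` it, run it with the remote Python interpreter, and (after the result JSON comes back) remove the temp directory. A minimal local sketch of that same sequence, with an illustrative payload and temp-dir name standing in for the real module and timestamped path (no SSH or ControlMaster involved):

```shell
#!/bin/sh
# Local mirror of the _low_level_execute_command() sequence from the log.
set -e

# 1. Create the temp dir, as in:
#    ( umask 77 && mkdir -p "`echo /root/.ansible/tmp`" && mkdir "...ansible-tmp-<ts>-..." )
tmpdir=$(umask 77 && mkdir -p "$HOME/.ansible/tmp" && \
         d="$HOME/.ansible/tmp/ansible-tmp-$(date +%s)-demo" && \
         mkdir "$d" && echo "$d")

# 2. "Transfer" the module payload; the log does this with an sftp put of
#    the locally built AnsiballZ_command.py. A trivial stand-in script here.
printf '%s\n' 'print("ok")' > "$tmpdir/AnsiballZ_command.py"

# 3. Make dir and payload executable, as in:
#    chmod u+x <tmpdir>/ <tmpdir>/AnsiballZ_command.py
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_command.py"

# 4. Execute with the target interpreter, as in:
#    /usr/bin/python3.12 <tmpdir>/AnsiballZ_command.py
python3 "$tmpdir/AnsiballZ_command.py"

# 5. Clean up, as in: rm -f -r <tmpdir>/ > /dev/null 2>&1
rm -rf "$tmpdir"
```

In the real run each of these five steps is a separate `/bin/sh -c '... && sleep 0'` invocation over the shared ControlMaster connection, which is why the SSH config-parsing debug output repeats before every step.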
13355 1727096181.65295: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096181.65298: _low_level_execute_command(): starting 13355 1727096181.65300: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096181.3673596-14705-209209552472407/ > /dev/null 2>&1 && sleep 0' 13355 1727096181.66058: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.66129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.66164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.66200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.66392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.68682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.68687: stdout chunk (state=3): >>><<< 13355 1727096181.68689: stderr chunk (state=3): >>><<< 13355 1727096181.68692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.68694: handler run complete 13355 
1727096181.68696: Evaluated conditional (False): False 13355 1727096181.68698: attempt loop complete, returning result 13355 1727096181.68700: _execute() done 13355 1727096181.68702: dumping result to json 13355 1727096181.68704: done dumping result, returning 13355 1727096181.68706: done running TaskExecutor() for managed_node3/TASK: From the active connection, get the port1 profile "bond0.0" [0afff68d-5257-c514-593f-0000000000c0] 13355 1727096181.68708: sending task result for task 0afff68d-5257-c514-593f-0000000000c0 13355 1727096181.68787: done sending task result for task 0afff68d-5257-c514-593f-0000000000c0 13355 1727096181.68791: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.0" ], "delta": "0:00:00.017156", "end": "2024-09-23 08:56:21.631330", "rc": 0, "start": "2024-09-23 08:56:21.614174" } 13355 1727096181.68872: no more pending results, returning what we have 13355 1727096181.68876: results queue empty 13355 1727096181.68877: checking for any_errors_fatal 13355 1727096181.68879: done checking for any_errors_fatal 13355 1727096181.68880: checking for max_fail_percentage 13355 1727096181.68882: done checking for max_fail_percentage 13355 1727096181.68883: checking to see if all hosts have failed and the running result is not ok 13355 1727096181.68883: done checking to see if all hosts have failed 13355 1727096181.68884: getting the remaining hosts for this loop 13355 1727096181.68889: done getting the remaining hosts for this loop 13355 1727096181.68893: getting the next task for host managed_node3 13355 1727096181.68905: done getting next task for host managed_node3 13355 1727096181.68909: ^ task is: TASK: From the active connection, get the port2 profile "{{ port2_profile }}" 13355 1727096181.68911: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096181.68916: getting variables 13355 1727096181.68917: in VariableManager get_vars() 13355 1727096181.69220: Calling all_inventory to load vars for managed_node3 13355 1727096181.69224: Calling groups_inventory to load vars for managed_node3 13355 1727096181.69229: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096181.69241: Calling all_plugins_play to load vars for managed_node3 13355 1727096181.69244: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096181.69247: Calling groups_plugins_play to load vars for managed_node3 13355 1727096181.71117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096181.72893: done with get_vars() 13355 1727096181.72919: done getting variables 13355 1727096181.73001: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096181.73123: variable 'port2_profile' from source: play vars TASK [From the active connection, get the port2 profile "bond0.1"] ************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:111 Monday 23 September 2024 08:56:21 -0400 (0:00:00.405) 0:00:30.992 ****** 13355 1727096181.73152: entering _queue_task() for managed_node3/command 13355 1727096181.73522: worker is 1 (out of 1 available) 13355 1727096181.73534: exiting _queue_task() for managed_node3/command 13355 1727096181.73548: done queuing things up, now waiting for results queue to drain 13355 1727096181.73549: waiting for pending results... 
13355 1727096181.73846: running TaskExecutor() for managed_node3/TASK: From the active connection, get the port2 profile "bond0.1" 13355 1727096181.73996: in run() - task 0afff68d-5257-c514-593f-0000000000c1 13355 1727096181.73999: variable 'ansible_search_path' from source: unknown 13355 1727096181.74173: calling self._execute() 13355 1727096181.74177: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096181.74180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096181.74183: variable 'omit' from source: magic vars 13355 1727096181.74572: variable 'ansible_distribution_major_version' from source: facts 13355 1727096181.74589: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096181.74716: variable 'network_provider' from source: set_fact 13355 1727096181.74740: Evaluated conditional (network_provider == "nm"): True 13355 1727096181.74751: variable 'omit' from source: magic vars 13355 1727096181.74779: variable 'omit' from source: magic vars 13355 1727096181.74886: variable 'port2_profile' from source: play vars 13355 1727096181.74912: variable 'omit' from source: magic vars 13355 1727096181.74970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096181.75012: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096181.75041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096181.75072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096181.75089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096181.75124: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096181.75133: 
variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096181.75163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096181.75253: Set connection var ansible_shell_executable to /bin/sh 13355 1727096181.75273: Set connection var ansible_shell_type to sh 13355 1727096181.75284: Set connection var ansible_pipelining to False 13355 1727096181.75372: Set connection var ansible_connection to ssh 13355 1727096181.75377: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096181.75379: Set connection var ansible_timeout to 10 13355 1727096181.75382: variable 'ansible_shell_executable' from source: unknown 13355 1727096181.75384: variable 'ansible_connection' from source: unknown 13355 1727096181.75385: variable 'ansible_module_compression' from source: unknown 13355 1727096181.75387: variable 'ansible_shell_type' from source: unknown 13355 1727096181.75389: variable 'ansible_shell_executable' from source: unknown 13355 1727096181.75391: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096181.75393: variable 'ansible_pipelining' from source: unknown 13355 1727096181.75394: variable 'ansible_timeout' from source: unknown 13355 1727096181.75397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096181.75573: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096181.75576: variable 'omit' from source: magic vars 13355 1727096181.75578: starting attempt loop 13355 1727096181.75580: running the handler 13355 1727096181.75582: _low_level_execute_command(): starting 13355 1727096181.75589: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 
1727096181.76364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096181.76386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096181.76407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096181.76516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.76792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.76849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.78560: stdout chunk (state=3): >>>/root <<< 13355 1727096181.78807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.78812: stdout chunk (state=3): >>><<< 13355 1727096181.78821: stderr chunk (state=3): >>><<< 13355 1727096181.78847: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.78868: _low_level_execute_command(): starting 13355 1727096181.78875: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582 `" && echo ansible-tmp-1727096181.7885125-14715-270886353141582="` echo /root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582 `" ) && sleep 0' 13355 1727096181.80259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096181.80294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.80378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.80401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.80485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.82688: stdout chunk (state=3): >>>ansible-tmp-1727096181.7885125-14715-270886353141582=/root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582 <<< 13355 1727096181.82828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.82880: stderr chunk (state=3): >>><<< 13355 1727096181.82883: stdout chunk (state=3): >>><<< 13355 1727096181.82906: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096181.7885125-14715-270886353141582=/root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.83111: variable 'ansible_module_compression' from source: unknown 13355 1727096181.83114: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096181.83230: variable 'ansible_facts' from source: unknown 13355 1727096181.83437: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/AnsiballZ_command.py 13355 1727096181.83678: Sending initial data 13355 1727096181.83687: Sent initial data (156 bytes) 13355 1727096181.84835: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096181.84973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.85114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.85125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.85150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.85300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.86963: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096181.86988: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096181.87075: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp9lnoi8vn /root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/AnsiballZ_command.py <<< 13355 1727096181.87145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp9lnoi8vn" to remote "/root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/AnsiballZ_command.py" <<< 13355 1727096181.88910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.88965: stderr chunk (state=3): >>><<< 13355 1727096181.89003: stdout chunk (state=3): >>><<< 13355 1727096181.89174: done transferring module to remote 13355 1727096181.89177: _low_level_execute_command(): starting 13355 1727096181.89180: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/ /root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/AnsiballZ_command.py && sleep 0' 13355 1727096181.90252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096181.90581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096181.90589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096181.90604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096181.90670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096181.92589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096181.92593: stdout chunk (state=3): >>><<< 13355 1727096181.92598: stderr chunk (state=3): >>><<< 13355 1727096181.92618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096181.92626: _low_level_execute_command(): starting 13355 1727096181.92631: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/AnsiballZ_command.py && sleep 0' 13355 1727096181.93840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096181.93844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13355 1727096181.93847: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096181.93850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096181.94181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096182.12112: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", 
"rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-23 08:56:22.097287", "end": "2024-09-23 08:56:22.114642", "delta": "0:00:00.017355", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096182.13562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096182.13566: stdout chunk (state=3): >>><<< 13355 1727096182.13570: stderr chunk (state=3): >>><<< 13355 1727096182.13589: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0.1"], "start": "2024-09-23 08:56:22.097287", "end": "2024-09-23 08:56:22.114642", "delta": "0:00:00.017355", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0.1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096182.13632: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096182.13954: _low_level_execute_command(): starting 13355 1727096182.13957: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096181.7885125-14715-270886353141582/ > /dev/null 2>&1 && sleep 0' 13355 1727096182.15376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096182.15394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096182.15397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096182.15702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096182.15815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096182.15858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096182.17795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096182.17981: stderr chunk (state=3): >>><<< 13355 1727096182.18381: stdout chunk (state=3): >>><<< 13355 1727096182.18385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096182.18392: handler run complete 13355 1727096182.18394: Evaluated conditional (False): False 13355 1727096182.18396: attempt loop complete, returning result 13355 1727096182.18398: _execute() done 13355 1727096182.18400: dumping result to json 13355 1727096182.18402: done dumping result, returning 13355 1727096182.18408: done running TaskExecutor() for managed_node3/TASK: From the active connection, get the port2 profile "bond0.1" [0afff68d-5257-c514-593f-0000000000c1] 13355 1727096182.18410: sending task result for task 0afff68d-5257-c514-593f-0000000000c1 13355 1727096182.18503: done sending task result for task 0afff68d-5257-c514-593f-0000000000c1 13355 1727096182.18507: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0.1" ], "delta": "0:00:00.017355", "end": "2024-09-23 08:56:22.114642", "rc": 0, "start": "2024-09-23 08:56:22.097287" } 13355 1727096182.18595: no more pending results, returning what we have 13355 1727096182.18600: results queue empty 13355 1727096182.18601: checking for any_errors_fatal 13355 1727096182.18613: done checking for any_errors_fatal 13355 1727096182.18614: checking for max_fail_percentage 13355 1727096182.18616: done checking for max_fail_percentage 13355 1727096182.18617: checking to see if all hosts have failed and the running result is not ok 13355 1727096182.18618: done checking to see if all hosts have failed 13355 1727096182.18619: getting the remaining 
hosts for this loop 13355 1727096182.18620: done getting the remaining hosts for this loop 13355 1727096182.18624: getting the next task for host managed_node3 13355 1727096182.18630: done getting next task for host managed_node3 13355 1727096182.18635: ^ task is: TASK: Assert that the port1 profile is not activated 13355 1727096182.18637: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096182.18642: getting variables 13355 1727096182.18644: in VariableManager get_vars() 13355 1727096182.19002: Calling all_inventory to load vars for managed_node3 13355 1727096182.19005: Calling groups_inventory to load vars for managed_node3 13355 1727096182.19008: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.19019: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.19022: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.19026: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.22558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096182.24475: done with get_vars() 13355 1727096182.24505: done getting variables 13355 1727096182.24579: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port1 profile is not activated] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:118 Monday 23 
September 2024 08:56:22 -0400 (0:00:00.514) 0:00:31.506 ****** 13355 1727096182.24614: entering _queue_task() for managed_node3/assert 13355 1727096182.25533: worker is 1 (out of 1 available) 13355 1727096182.25545: exiting _queue_task() for managed_node3/assert 13355 1727096182.25560: done queuing things up, now waiting for results queue to drain 13355 1727096182.25562: waiting for pending results... 13355 1727096182.26019: running TaskExecutor() for managed_node3/TASK: Assert that the port1 profile is not activated 13355 1727096182.26331: in run() - task 0afff68d-5257-c514-593f-0000000000c2 13355 1727096182.26336: variable 'ansible_search_path' from source: unknown 13355 1727096182.26388: calling self._execute() 13355 1727096182.26657: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.26671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.26689: variable 'omit' from source: magic vars 13355 1727096182.27376: variable 'ansible_distribution_major_version' from source: facts 13355 1727096182.27594: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096182.27832: variable 'network_provider' from source: set_fact 13355 1727096182.27843: Evaluated conditional (network_provider == "nm"): True 13355 1727096182.27853: variable 'omit' from source: magic vars 13355 1727096182.27881: variable 'omit' from source: magic vars 13355 1727096182.28097: variable 'port1_profile' from source: play vars 13355 1727096182.28121: variable 'omit' from source: magic vars 13355 1727096182.28383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096182.28386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096182.28388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096182.28390: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096182.28393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096182.28509: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096182.28520: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.28529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.28772: Set connection var ansible_shell_executable to /bin/sh 13355 1727096182.28776: Set connection var ansible_shell_type to sh 13355 1727096182.28779: Set connection var ansible_pipelining to False 13355 1727096182.28781: Set connection var ansible_connection to ssh 13355 1727096182.28784: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096182.28786: Set connection var ansible_timeout to 10 13355 1727096182.28991: variable 'ansible_shell_executable' from source: unknown 13355 1727096182.28999: variable 'ansible_connection' from source: unknown 13355 1727096182.29074: variable 'ansible_module_compression' from source: unknown 13355 1727096182.29077: variable 'ansible_shell_type' from source: unknown 13355 1727096182.29080: variable 'ansible_shell_executable' from source: unknown 13355 1727096182.29083: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.29086: variable 'ansible_pipelining' from source: unknown 13355 1727096182.29088: variable 'ansible_timeout' from source: unknown 13355 1727096182.29092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.29310: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096182.29332: variable 'omit' from source: magic vars 13355 1727096182.29343: starting attempt loop 13355 1727096182.29351: running the handler 13355 1727096182.29614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096182.34674: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096182.34680: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096182.34683: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096182.34685: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096182.34687: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096182.34872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096182.35004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096182.35041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096182.35173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 13355 1727096182.35195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096182.35462: variable 'active_port1_profile' from source: set_fact 13355 1727096182.35490: Evaluated conditional (active_port1_profile.stdout | length == 0): True 13355 1727096182.35501: handler run complete 13355 1727096182.35583: attempt loop complete, returning result 13355 1727096182.35591: _execute() done 13355 1727096182.35599: dumping result to json 13355 1727096182.35606: done dumping result, returning 13355 1727096182.35620: done running TaskExecutor() for managed_node3/TASK: Assert that the port1 profile is not activated [0afff68d-5257-c514-593f-0000000000c2] 13355 1727096182.35631: sending task result for task 0afff68d-5257-c514-593f-0000000000c2 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096182.35787: no more pending results, returning what we have 13355 1727096182.35791: results queue empty 13355 1727096182.35792: checking for any_errors_fatal 13355 1727096182.35802: done checking for any_errors_fatal 13355 1727096182.35803: checking for max_fail_percentage 13355 1727096182.35805: done checking for max_fail_percentage 13355 1727096182.35805: checking to see if all hosts have failed and the running result is not ok 13355 1727096182.35806: done checking to see if all hosts have failed 13355 1727096182.35807: getting the remaining hosts for this loop 13355 1727096182.35808: done getting the remaining hosts for this loop 13355 1727096182.35812: getting the next task for host managed_node3 13355 1727096182.35819: done getting next task for host managed_node3 13355 1727096182.35822: ^ task is: TASK: Assert that the port2 profile is not activated 13355 1727096182.35824: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096182.35829: getting variables 13355 1727096182.35830: in VariableManager get_vars() 13355 1727096182.35889: Calling all_inventory to load vars for managed_node3 13355 1727096182.35900: Calling groups_inventory to load vars for managed_node3 13355 1727096182.35903: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.35916: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.35919: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.35922: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.37087: done sending task result for task 0afff68d-5257-c514-593f-0000000000c2 13355 1727096182.37090: WORKER PROCESS EXITING 13355 1727096182.37915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096182.39511: done with get_vars() 13355 1727096182.39540: done getting variables 13355 1727096182.39599: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port2 profile is not activated] ************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:125 Monday 23 September 2024 08:56:22 -0400 (0:00:00.150) 0:00:31.656 ****** 13355 1727096182.39627: entering _queue_task() for managed_node3/assert 13355 1727096182.40003: worker is 1 (out of 1 available) 13355 1727096182.40017: exiting _queue_task() for managed_node3/assert 13355 
1727096182.40032: done queuing things up, now waiting for results queue to drain 13355 1727096182.40034: waiting for pending results... 13355 1727096182.40320: running TaskExecutor() for managed_node3/TASK: Assert that the port2 profile is not activated 13355 1727096182.40427: in run() - task 0afff68d-5257-c514-593f-0000000000c3 13355 1727096182.40449: variable 'ansible_search_path' from source: unknown 13355 1727096182.40496: calling self._execute() 13355 1727096182.40594: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.40607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.40622: variable 'omit' from source: magic vars 13355 1727096182.41005: variable 'ansible_distribution_major_version' from source: facts 13355 1727096182.41022: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096182.41132: variable 'network_provider' from source: set_fact 13355 1727096182.41147: Evaluated conditional (network_provider == "nm"): True 13355 1727096182.41158: variable 'omit' from source: magic vars 13355 1727096182.41184: variable 'omit' from source: magic vars 13355 1727096182.41287: variable 'port2_profile' from source: play vars 13355 1727096182.41311: variable 'omit' from source: magic vars 13355 1727096182.41359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096182.41406: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096182.41432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096182.41456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096182.41478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 
13355 1727096182.41514: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096182.41522: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.41529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.41632: Set connection var ansible_shell_executable to /bin/sh 13355 1727096182.41644: Set connection var ansible_shell_type to sh 13355 1727096182.41654: Set connection var ansible_pipelining to False 13355 1727096182.41664: Set connection var ansible_connection to ssh 13355 1727096182.41676: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096182.41690: Set connection var ansible_timeout to 10 13355 1727096182.41719: variable 'ansible_shell_executable' from source: unknown 13355 1727096182.41727: variable 'ansible_connection' from source: unknown 13355 1727096182.41734: variable 'ansible_module_compression' from source: unknown 13355 1727096182.41745: variable 'ansible_shell_type' from source: unknown 13355 1727096182.41751: variable 'ansible_shell_executable' from source: unknown 13355 1727096182.41799: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.41802: variable 'ansible_pipelining' from source: unknown 13355 1727096182.41805: variable 'ansible_timeout' from source: unknown 13355 1727096182.41808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.41932: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096182.41951: variable 'omit' from source: magic vars 13355 1727096182.41962: starting attempt loop 13355 1727096182.41972: running the handler 13355 1727096182.42156: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096182.44302: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096182.44362: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096182.44411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096182.44472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096182.44475: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096182.44548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096182.44583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096182.44612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096182.44734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096182.44738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096182.44790: variable 'active_port2_profile' from source: set_fact 13355 1727096182.44817: 
Evaluated conditional (active_port2_profile.stdout | length == 0): True 13355 1727096182.44829: handler run complete 13355 1727096182.44856: attempt loop complete, returning result 13355 1727096182.44865: _execute() done 13355 1727096182.44875: dumping result to json 13355 1727096182.44882: done dumping result, returning 13355 1727096182.44895: done running TaskExecutor() for managed_node3/TASK: Assert that the port2 profile is not activated [0afff68d-5257-c514-593f-0000000000c3] 13355 1727096182.44904: sending task result for task 0afff68d-5257-c514-593f-0000000000c3 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096182.45056: no more pending results, returning what we have 13355 1727096182.45060: results queue empty 13355 1727096182.45061: checking for any_errors_fatal 13355 1727096182.45070: done checking for any_errors_fatal 13355 1727096182.45071: checking for max_fail_percentage 13355 1727096182.45074: done checking for max_fail_percentage 13355 1727096182.45075: checking to see if all hosts have failed and the running result is not ok 13355 1727096182.45076: done checking to see if all hosts have failed 13355 1727096182.45076: getting the remaining hosts for this loop 13355 1727096182.45078: done getting the remaining hosts for this loop 13355 1727096182.45081: getting the next task for host managed_node3 13355 1727096182.45088: done getting next task for host managed_node3 13355 1727096182.45091: ^ task is: TASK: Get the port1 device state 13355 1727096182.45093: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096182.45098: getting variables 13355 1727096182.45099: in VariableManager get_vars() 13355 1727096182.45158: Calling all_inventory to load vars for managed_node3 13355 1727096182.45161: Calling groups_inventory to load vars for managed_node3 13355 1727096182.45475: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.45485: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.45489: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.45492: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.46182: done sending task result for task 0afff68d-5257-c514-593f-0000000000c3 13355 1727096182.46186: WORKER PROCESS EXITING 13355 1727096182.46927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096182.48490: done with get_vars() 13355 1727096182.48520: done getting variables 13355 1727096182.48583: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the port1 device state] ********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:132 Monday 23 September 2024 08:56:22 -0400 (0:00:00.089) 0:00:31.746 ****** 13355 1727096182.48613: entering _queue_task() for managed_node3/command 13355 1727096182.48997: worker is 1 (out of 1 available) 13355 1727096182.49013: exiting _queue_task() for managed_node3/command 13355 1727096182.49027: done queuing things up, now waiting for results queue to drain 13355 1727096182.49029: waiting for pending results... 
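The trace above queues "Get the port1 device state" and, just below, skips it because `network_provider == "initscripts"` evaluates False. A hypothetical reconstruction of what such a task looks like (only the task name, file location, and when-condition appear in the log; the actual command and the `port1` variable are assumptions):

```yaml
# Hypothetical sketch of the skipped task. Only the name and the
# when-condition are confirmed by the log; the command is assumed.
- name: Get the port1 device state
  command: ip -o link show "{{ port1 }}"
  register: port1_device_state
  when: network_provider == "initscripts"
```

Because `network_provider` is `nm` (not `initscripts`) in this run, the conditional short-circuits and the executor emits the `skipping: [managed_node3]` result seen below without ever dispatching the command to the host.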
13355 1727096182.49315: running TaskExecutor() for managed_node3/TASK: Get the port1 device state 13355 1727096182.49438: in run() - task 0afff68d-5257-c514-593f-0000000000c4 13355 1727096182.49463: variable 'ansible_search_path' from source: unknown 13355 1727096182.49514: calling self._execute() 13355 1727096182.49628: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.49641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.49653: variable 'omit' from source: magic vars 13355 1727096182.50063: variable 'ansible_distribution_major_version' from source: facts 13355 1727096182.50083: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096182.50200: variable 'network_provider' from source: set_fact 13355 1727096182.50211: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096182.50219: when evaluation is False, skipping this task 13355 1727096182.50225: _execute() done 13355 1727096182.50233: dumping result to json 13355 1727096182.50241: done dumping result, returning 13355 1727096182.50255: done running TaskExecutor() for managed_node3/TASK: Get the port1 device state [0afff68d-5257-c514-593f-0000000000c4] 13355 1727096182.50266: sending task result for task 0afff68d-5257-c514-593f-0000000000c4 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096182.50436: no more pending results, returning what we have 13355 1727096182.50441: results queue empty 13355 1727096182.50441: checking for any_errors_fatal 13355 1727096182.50448: done checking for any_errors_fatal 13355 1727096182.50449: checking for max_fail_percentage 13355 1727096182.50451: done checking for max_fail_percentage 13355 1727096182.50452: checking to see if all hosts have failed and the running result is not ok 13355 1727096182.50453: done checking to see if all 
hosts have failed 13355 1727096182.50453: getting the remaining hosts for this loop 13355 1727096182.50455: done getting the remaining hosts for this loop 13355 1727096182.50458: getting the next task for host managed_node3 13355 1727096182.50465: done getting next task for host managed_node3 13355 1727096182.50469: ^ task is: TASK: Get the port2 device state 13355 1727096182.50473: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096182.50477: getting variables 13355 1727096182.50478: in VariableManager get_vars() 13355 1727096182.50535: Calling all_inventory to load vars for managed_node3 13355 1727096182.50538: Calling groups_inventory to load vars for managed_node3 13355 1727096182.50540: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.50554: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.50556: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.50559: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.51182: done sending task result for task 0afff68d-5257-c514-593f-0000000000c4 13355 1727096182.51186: WORKER PROCESS EXITING 13355 1727096182.52335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096182.53871: done with get_vars() 13355 1727096182.53901: done getting variables 13355 1727096182.53962: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the port2 device state] 
********************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:139 Monday 23 September 2024 08:56:22 -0400 (0:00:00.053) 0:00:31.800 ****** 13355 1727096182.53994: entering _queue_task() for managed_node3/command 13355 1727096182.54337: worker is 1 (out of 1 available) 13355 1727096182.54350: exiting _queue_task() for managed_node3/command 13355 1727096182.54362: done queuing things up, now waiting for results queue to drain 13355 1727096182.54363: waiting for pending results... 13355 1727096182.54903: running TaskExecutor() for managed_node3/TASK: Get the port2 device state 13355 1727096182.55017: in run() - task 0afff68d-5257-c514-593f-0000000000c5 13355 1727096182.55215: variable 'ansible_search_path' from source: unknown 13355 1727096182.55220: calling self._execute() 13355 1727096182.55387: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.55444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.55460: variable 'omit' from source: magic vars 13355 1727096182.56394: variable 'ansible_distribution_major_version' from source: facts 13355 1727096182.56398: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096182.56523: variable 'network_provider' from source: set_fact 13355 1727096182.56619: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096182.56628: when evaluation is False, skipping this task 13355 1727096182.56638: _execute() done 13355 1727096182.56645: dumping result to json 13355 1727096182.56652: done dumping result, returning 13355 1727096182.56664: done running TaskExecutor() for managed_node3/TASK: Get the port2 device state [0afff68d-5257-c514-593f-0000000000c5] 13355 1727096182.56677: sending task result for task 0afff68d-5257-c514-593f-0000000000c5 skipping: [managed_node3] => { "changed": false, 
"false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096182.56839: no more pending results, returning what we have 13355 1727096182.56844: results queue empty 13355 1727096182.56845: checking for any_errors_fatal 13355 1727096182.56855: done checking for any_errors_fatal 13355 1727096182.56855: checking for max_fail_percentage 13355 1727096182.56858: done checking for max_fail_percentage 13355 1727096182.56859: checking to see if all hosts have failed and the running result is not ok 13355 1727096182.56860: done checking to see if all hosts have failed 13355 1727096182.56860: getting the remaining hosts for this loop 13355 1727096182.56862: done getting the remaining hosts for this loop 13355 1727096182.56866: getting the next task for host managed_node3 13355 1727096182.56875: done getting next task for host managed_node3 13355 1727096182.56878: ^ task is: TASK: Assert that the port1 device is in DOWN state 13355 1727096182.56881: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096182.56886: getting variables 13355 1727096182.56887: in VariableManager get_vars() 13355 1727096182.56952: Calling all_inventory to load vars for managed_node3 13355 1727096182.56955: Calling groups_inventory to load vars for managed_node3 13355 1727096182.56957: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.57174: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.57179: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.57183: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.58476: done sending task result for task 0afff68d-5257-c514-593f-0000000000c5 13355 1727096182.58480: WORKER PROCESS EXITING 13355 1727096182.60007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096182.63300: done with get_vars() 13355 1727096182.63332: done getting variables 13355 1727096182.63391: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the port1 device is in DOWN state] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:146 Monday 23 September 2024 08:56:22 -0400 (0:00:00.094) 0:00:31.894 ****** 13355 1727096182.63418: entering _queue_task() for managed_node3/assert 13355 1727096182.64203: worker is 1 (out of 1 available) 13355 1727096182.64217: exiting _queue_task() for managed_node3/assert 13355 1727096182.64234: done queuing things up, now waiting for results queue to drain 13355 1727096182.64236: waiting for pending results... 
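The next two tasks follow the same gate: "Assert that the port1 device is in DOWN state" and its port2 twin are skipped on the `network_provider == "initscripts"` conditional. A hedged sketch of the shape such an assert task typically takes (the registered variable name and the exact `DOWN` check are assumptions; the log only shows the task name and its when-condition):

```yaml
# Hypothetical sketch; the variable name and the asserted substring
# are assumptions not confirmed by the log.
- name: Assert that the port1 device is in DOWN state
  assert:
    that:
      - "'state DOWN' in port1_device_state.stdout"
  when: network_provider == "initscripts"
```

Note that the `assert` action runs entirely on the controller, which is why the skip above produces no connection activity at all.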
13355 1727096182.64837: running TaskExecutor() for managed_node3/TASK: Assert that the port1 device is in DOWN state 13355 1727096182.65061: in run() - task 0afff68d-5257-c514-593f-0000000000c6 13355 1727096182.65139: variable 'ansible_search_path' from source: unknown 13355 1727096182.65237: calling self._execute() 13355 1727096182.65569: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.65573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.65585: variable 'omit' from source: magic vars 13355 1727096182.66432: variable 'ansible_distribution_major_version' from source: facts 13355 1727096182.66443: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096182.66781: variable 'network_provider' from source: set_fact 13355 1727096182.66791: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096182.66794: when evaluation is False, skipping this task 13355 1727096182.66797: _execute() done 13355 1727096182.66800: dumping result to json 13355 1727096182.66802: done dumping result, returning 13355 1727096182.66811: done running TaskExecutor() for managed_node3/TASK: Assert that the port1 device is in DOWN state [0afff68d-5257-c514-593f-0000000000c6] 13355 1727096182.66817: sending task result for task 0afff68d-5257-c514-593f-0000000000c6 13355 1727096182.66927: done sending task result for task 0afff68d-5257-c514-593f-0000000000c6 13355 1727096182.66930: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096182.66985: no more pending results, returning what we have 13355 1727096182.66990: results queue empty 13355 1727096182.66991: checking for any_errors_fatal 13355 1727096182.66998: done checking for any_errors_fatal 13355 1727096182.66999: checking for max_fail_percentage 13355 1727096182.67000: done 
checking for max_fail_percentage 13355 1727096182.67001: checking to see if all hosts have failed and the running result is not ok 13355 1727096182.67002: done checking to see if all hosts have failed 13355 1727096182.67002: getting the remaining hosts for this loop 13355 1727096182.67004: done getting the remaining hosts for this loop 13355 1727096182.67007: getting the next task for host managed_node3 13355 1727096182.67013: done getting next task for host managed_node3 13355 1727096182.67015: ^ task is: TASK: Assert that the port2 device is in DOWN state 13355 1727096182.67018: ^ state is: HOST STATE: block=2, task=26, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096182.67021: getting variables 13355 1727096182.67023: in VariableManager get_vars() 13355 1727096182.67083: Calling all_inventory to load vars for managed_node3 13355 1727096182.67086: Calling groups_inventory to load vars for managed_node3 13355 1727096182.67090: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.67103: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.67106: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.67109: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.70037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096182.73216: done with get_vars() 13355 1727096182.73247: done getting variables 13355 1727096182.73513: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [Assert that the port2 device is in DOWN state] *************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:153 Monday 23 September 2024 08:56:22 -0400 (0:00:00.101) 0:00:31.996 ****** 13355 1727096182.73543: entering _queue_task() for managed_node3/assert 13355 1727096182.74363: worker is 1 (out of 1 available) 13355 1727096182.74381: exiting _queue_task() for managed_node3/assert 13355 1727096182.74397: done queuing things up, now waiting for results queue to drain 13355 1727096182.74398: waiting for pending results... 13355 1727096182.74873: running TaskExecutor() for managed_node3/TASK: Assert that the port2 device is in DOWN state 13355 1727096182.75376: in run() - task 0afff68d-5257-c514-593f-0000000000c7 13355 1727096182.75381: variable 'ansible_search_path' from source: unknown 13355 1727096182.75384: calling self._execute() 13355 1727096182.75564: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.75570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.75583: variable 'omit' from source: magic vars 13355 1727096182.76575: variable 'ansible_distribution_major_version' from source: facts 13355 1727096182.76587: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096182.76918: variable 'network_provider' from source: set_fact 13355 1727096182.76922: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096182.76924: when evaluation is False, skipping this task 13355 1727096182.76928: _execute() done 13355 1727096182.76930: dumping result to json 13355 1727096182.76972: done dumping result, returning 13355 1727096182.76976: done running TaskExecutor() for managed_node3/TASK: Assert that the port2 device is in DOWN state [0afff68d-5257-c514-593f-0000000000c7] 13355 1727096182.76979: sending task 
result for task 0afff68d-5257-c514-593f-0000000000c7 13355 1727096182.77273: done sending task result for task 0afff68d-5257-c514-593f-0000000000c7 13355 1727096182.77277: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096182.77322: no more pending results, returning what we have 13355 1727096182.77326: results queue empty 13355 1727096182.77327: checking for any_errors_fatal 13355 1727096182.77333: done checking for any_errors_fatal 13355 1727096182.77334: checking for max_fail_percentage 13355 1727096182.77336: done checking for max_fail_percentage 13355 1727096182.77337: checking to see if all hosts have failed and the running result is not ok 13355 1727096182.77338: done checking to see if all hosts have failed 13355 1727096182.77338: getting the remaining hosts for this loop 13355 1727096182.77340: done getting the remaining hosts for this loop 13355 1727096182.77343: getting the next task for host managed_node3 13355 1727096182.77351: done getting next task for host managed_node3 13355 1727096182.77358: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096182.77361: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096182.77386: getting variables 13355 1727096182.77387: in VariableManager get_vars() 13355 1727096182.77443: Calling all_inventory to load vars for managed_node3 13355 1727096182.77446: Calling groups_inventory to load vars for managed_node3 13355 1727096182.77449: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.77460: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.77463: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.77466: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.80198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096182.83734: done with get_vars() 13355 1727096182.83769: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:56:22 -0400 (0:00:00.103) 0:00:32.101 ****** 13355 1727096182.84079: entering _queue_task() for managed_node3/include_tasks 13355 1727096182.84663: worker is 1 (out of 1 available) 13355 1727096182.84782: exiting _queue_task() for managed_node3/include_tasks 13355 1727096182.84795: done queuing things up, now waiting for results queue to drain 13355 1727096182.84797: waiting for pending results... 
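At this point execution enters the role proper: the `include_tasks` at `roles/network/tasks/main.yml:4` pulls in `set_facts.yml`, which the log below confirms when it reports "processing included file". A minimal sketch of that include step (the exact YAML layout is an assumption; only the task name and the included filename are confirmed by the log):

```yaml
# Sketch of the include at roles/network/tasks/main.yml:4; the log
# confirms set_facts.yml is loaded, the surrounding layout is assumed.
- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
```

Dynamic includes like this explain the "extending task lists for all hosts with included blocks" messages further down: the included blocks are spliced into the host's task list at runtime rather than at parse time.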
13355 1727096182.85196: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096182.85877: in run() - task 0afff68d-5257-c514-593f-0000000000cf 13355 1727096182.85881: variable 'ansible_search_path' from source: unknown 13355 1727096182.85884: variable 'ansible_search_path' from source: unknown 13355 1727096182.85887: calling self._execute() 13355 1727096182.86057: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096182.86182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096182.86198: variable 'omit' from source: magic vars 13355 1727096182.87265: variable 'ansible_distribution_major_version' from source: facts 13355 1727096182.87279: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096182.87287: _execute() done 13355 1727096182.87291: dumping result to json 13355 1727096182.87293: done dumping result, returning 13355 1727096182.87303: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-c514-593f-0000000000cf] 13355 1727096182.87308: sending task result for task 0afff68d-5257-c514-593f-0000000000cf 13355 1727096182.87651: no more pending results, returning what we have 13355 1727096182.87657: in VariableManager get_vars() 13355 1727096182.87720: Calling all_inventory to load vars for managed_node3 13355 1727096182.87723: Calling groups_inventory to load vars for managed_node3 13355 1727096182.87725: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.87737: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.87739: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.87742: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.88610: done sending task result for task 0afff68d-5257-c514-593f-0000000000cf 13355 
1727096182.88614: WORKER PROCESS EXITING 13355 1727096182.90593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096182.93792: done with get_vars() 13355 1727096182.93826: variable 'ansible_search_path' from source: unknown 13355 1727096182.93827: variable 'ansible_search_path' from source: unknown 13355 1727096182.93875: we have included files to process 13355 1727096182.93876: generating all_blocks data 13355 1727096182.93879: done generating all_blocks data 13355 1727096182.93885: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096182.93886: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096182.93888: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096182.95107: done processing included file 13355 1727096182.95110: iterating over new_blocks loaded from include file 13355 1727096182.95112: in VariableManager get_vars() 13355 1727096182.95148: done with get_vars() 13355 1727096182.95149: filtering new block on tags 13355 1727096182.95372: done filtering new block on tags 13355 1727096182.95377: in VariableManager get_vars() 13355 1727096182.95412: done with get_vars() 13355 1727096182.95414: filtering new block on tags 13355 1727096182.95438: done filtering new block on tags 13355 1727096182.95440: in VariableManager get_vars() 13355 1727096182.95470: done with get_vars() 13355 1727096182.95472: filtering new block on tags 13355 1727096182.95490: done filtering new block on tags 13355 1727096182.95492: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13355 1727096182.95497: extending task lists for all hosts 
with included blocks 13355 1727096182.97125: done extending task lists 13355 1727096182.97127: done processing included files 13355 1727096182.97128: results queue empty 13355 1727096182.97129: checking for any_errors_fatal 13355 1727096182.97132: done checking for any_errors_fatal 13355 1727096182.97133: checking for max_fail_percentage 13355 1727096182.97134: done checking for max_fail_percentage 13355 1727096182.97135: checking to see if all hosts have failed and the running result is not ok 13355 1727096182.97136: done checking to see if all hosts have failed 13355 1727096182.97137: getting the remaining hosts for this loop 13355 1727096182.97138: done getting the remaining hosts for this loop 13355 1727096182.97141: getting the next task for host managed_node3 13355 1727096182.97146: done getting next task for host managed_node3 13355 1727096182.97149: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096182.97153: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096182.97166: getting variables 13355 1727096182.97372: in VariableManager get_vars() 13355 1727096182.97402: Calling all_inventory to load vars for managed_node3 13355 1727096182.97405: Calling groups_inventory to load vars for managed_node3 13355 1727096182.97407: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096182.97415: Calling all_plugins_play to load vars for managed_node3 13355 1727096182.97418: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096182.97421: Calling groups_plugins_play to load vars for managed_node3 13355 1727096182.99944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096183.03394: done with get_vars() 13355 1727096183.03426: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:56:23 -0400 (0:00:00.194) 0:00:32.295 ****** 13355 1727096183.03504: entering _queue_task() for managed_node3/setup 13355 1727096183.04279: worker is 1 (out of 1 available) 13355 1727096183.04292: exiting _queue_task() for managed_node3/setup 13355 1727096183.04305: done queuing things up, now waiting for results queue to drain 13355 1727096183.04307: waiting for pending results... 
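The first task from `set_facts.yml` is a conditional fact-gathering step: it runs `setup` only when the facts the role needs are missing. A sketch under stated assumptions (the when-expression and `no_log` behavior are taken from the conditional and censored result visible in the log below; the `gather_subset` value is an assumption):

```yaml
# The when-expression matches the conditional evaluated in the log;
# gather_subset: min is an assumed detail not shown there.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min
  no_log: true
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
```

In this run the difference is empty (all required facts are already cached), so the conditional evaluates False and the task is skipped with its output censored by `no_log: true`, exactly as the result below shows.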
13355 1727096183.04842: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096183.05222: in run() - task 0afff68d-5257-c514-593f-000000000796 13355 1727096183.05242: variable 'ansible_search_path' from source: unknown 13355 1727096183.05249: variable 'ansible_search_path' from source: unknown 13355 1727096183.05294: calling self._execute() 13355 1727096183.05775: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096183.05780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096183.05783: variable 'omit' from source: magic vars 13355 1727096183.06471: variable 'ansible_distribution_major_version' from source: facts 13355 1727096183.06774: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096183.07027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096183.11272: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096183.11396: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096183.11445: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096183.11532: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096183.11573: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096183.11671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096183.11712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096183.11760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096183.11821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096183.11891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096183.11953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096183.11988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096183.12027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096183.12087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096183.12122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096183.12327: variable '__network_required_facts' from source: role 
'' defaults 13355 1727096183.12330: variable 'ansible_facts' from source: unknown 13355 1727096183.13237: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13355 1727096183.13241: when evaluation is False, skipping this task 13355 1727096183.13244: _execute() done 13355 1727096183.13246: dumping result to json 13355 1727096183.13248: done dumping result, returning 13355 1727096183.13266: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-c514-593f-000000000796] 13355 1727096183.13279: sending task result for task 0afff68d-5257-c514-593f-000000000796 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096183.13690: no more pending results, returning what we have 13355 1727096183.13695: results queue empty 13355 1727096183.13696: checking for any_errors_fatal 13355 1727096183.13698: done checking for any_errors_fatal 13355 1727096183.13699: checking for max_fail_percentage 13355 1727096183.13701: done checking for max_fail_percentage 13355 1727096183.13702: checking to see if all hosts have failed and the running result is not ok 13355 1727096183.13703: done checking to see if all hosts have failed 13355 1727096183.13703: getting the remaining hosts for this loop 13355 1727096183.13705: done getting the remaining hosts for this loop 13355 1727096183.13709: getting the next task for host managed_node3 13355 1727096183.13720: done getting next task for host managed_node3 13355 1727096183.13725: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096183.13730: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096183.13753: getting variables 13355 1727096183.13754: in VariableManager get_vars() 13355 1727096183.13933: Calling all_inventory to load vars for managed_node3 13355 1727096183.13936: Calling groups_inventory to load vars for managed_node3 13355 1727096183.13938: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096183.13948: Calling all_plugins_play to load vars for managed_node3 13355 1727096183.13951: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096183.13953: Calling groups_plugins_play to load vars for managed_node3 13355 1727096183.14541: done sending task result for task 0afff68d-5257-c514-593f-000000000796 13355 1727096183.14544: WORKER PROCESS EXITING 13355 1727096183.15497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096183.17163: done with get_vars() 13355 1727096183.17195: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:56:23 -0400 (0:00:00.137) 0:00:32.433 ****** 13355 1727096183.17317: entering _queue_task() for managed_node3/stat 13355 1727096183.17778: worker is 
1 (out of 1 available) 13355 1727096183.17791: exiting _queue_task() for managed_node3/stat 13355 1727096183.17804: done queuing things up, now waiting for results queue to drain 13355 1727096183.17805: waiting for pending results... 13355 1727096183.18046: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096183.18225: in run() - task 0afff68d-5257-c514-593f-000000000798 13355 1727096183.18248: variable 'ansible_search_path' from source: unknown 13355 1727096183.18259: variable 'ansible_search_path' from source: unknown 13355 1727096183.18309: calling self._execute() 13355 1727096183.18430: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096183.18442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096183.18454: variable 'omit' from source: magic vars 13355 1727096183.18947: variable 'ansible_distribution_major_version' from source: facts 13355 1727096183.18951: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096183.19145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096183.19466: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096183.19542: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096183.19597: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096183.19643: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096183.19787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096183.19790: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096183.19807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096183.19859: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096183.19970: variable '__network_is_ostree' from source: set_fact 13355 1727096183.19983: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096183.19990: when evaluation is False, skipping this task 13355 1727096183.20042: _execute() done 13355 1727096183.20045: dumping result to json 13355 1727096183.20048: done dumping result, returning 13355 1727096183.20050: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-c514-593f-000000000798] 13355 1727096183.20052: sending task result for task 0afff68d-5257-c514-593f-000000000798 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096183.20203: no more pending results, returning what we have 13355 1727096183.20209: results queue empty 13355 1727096183.20210: checking for any_errors_fatal 13355 1727096183.20217: done checking for any_errors_fatal 13355 1727096183.20218: checking for max_fail_percentage 13355 1727096183.20220: done checking for max_fail_percentage 13355 1727096183.20221: checking to see if all hosts have failed and the running result is not ok 13355 1727096183.20222: done checking to see if all hosts have failed 13355 1727096183.20223: getting the remaining hosts for this loop 13355 
1727096183.20225: done getting the remaining hosts for this loop 13355 1727096183.20228: getting the next task for host managed_node3 13355 1727096183.20238: done getting next task for host managed_node3 13355 1727096183.20242: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096183.20246: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096183.20274: getting variables 13355 1727096183.20276: in VariableManager get_vars() 13355 1727096183.20336: Calling all_inventory to load vars for managed_node3 13355 1727096183.20339: Calling groups_inventory to load vars for managed_node3 13355 1727096183.20342: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096183.20353: Calling all_plugins_play to load vars for managed_node3 13355 1727096183.20359: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096183.20364: Calling groups_plugins_play to load vars for managed_node3 13355 1727096183.20592: done sending task result for task 0afff68d-5257-c514-593f-000000000798 13355 1727096183.20596: WORKER PROCESS EXITING 13355 1727096183.23838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096183.27632: done with get_vars() 13355 1727096183.27673: done getting variables 13355 1727096183.27738: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:56:23 -0400 (0:00:00.106) 0:00:32.540 ****** 13355 1727096183.27985: entering _queue_task() for managed_node3/set_fact 13355 1727096183.28590: worker is 1 (out of 1 available) 13355 1727096183.28603: exiting _queue_task() for managed_node3/set_fact 13355 1727096183.28616: done queuing things up, now waiting for results queue to drain 13355 1727096183.28617: waiting for pending results... 
13355 1727096183.29236: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096183.29562: in run() - task 0afff68d-5257-c514-593f-000000000799 13355 1727096183.29637: variable 'ansible_search_path' from source: unknown 13355 1727096183.29646: variable 'ansible_search_path' from source: unknown 13355 1727096183.29875: calling self._execute() 13355 1727096183.29916: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096183.29988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096183.30003: variable 'omit' from source: magic vars 13355 1727096183.30980: variable 'ansible_distribution_major_version' from source: facts 13355 1727096183.31120: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096183.31430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096183.32036: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096183.32132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096183.32374: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096183.32378: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096183.32574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096183.32577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096183.32763: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096183.32811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096183.32919: variable '__network_is_ostree' from source: set_fact 13355 1727096183.33028: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096183.33038: when evaluation is False, skipping this task 13355 1727096183.33045: _execute() done 13355 1727096183.33052: dumping result to json 13355 1727096183.33070: done dumping result, returning 13355 1727096183.33084: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-c514-593f-000000000799] 13355 1727096183.33185: sending task result for task 0afff68d-5257-c514-593f-000000000799 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096183.33354: no more pending results, returning what we have 13355 1727096183.33361: results queue empty 13355 1727096183.33362: checking for any_errors_fatal 13355 1727096183.33372: done checking for any_errors_fatal 13355 1727096183.33373: checking for max_fail_percentage 13355 1727096183.33374: done checking for max_fail_percentage 13355 1727096183.33375: checking to see if all hosts have failed and the running result is not ok 13355 1727096183.33376: done checking to see if all hosts have failed 13355 1727096183.33377: getting the remaining hosts for this loop 13355 1727096183.33378: done getting the remaining hosts for this loop 13355 1727096183.33382: getting the next task for host managed_node3 13355 1727096183.33398: done getting next task for host managed_node3 13355 
1727096183.33403: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096183.33407: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096183.33434: getting variables 13355 1727096183.33435: in VariableManager get_vars() 13355 1727096183.33701: Calling all_inventory to load vars for managed_node3 13355 1727096183.33705: Calling groups_inventory to load vars for managed_node3 13355 1727096183.33707: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096183.33718: done sending task result for task 0afff68d-5257-c514-593f-000000000799 13355 1727096183.33721: WORKER PROCESS EXITING 13355 1727096183.33731: Calling all_plugins_play to load vars for managed_node3 13355 1727096183.33734: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096183.33737: Calling groups_plugins_play to load vars for managed_node3 13355 1727096183.36821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096183.40613: done with get_vars() 13355 1727096183.40644: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:56:23 -0400 (0:00:00.128) 0:00:32.669 ****** 13355 1727096183.40859: entering _queue_task() for managed_node3/service_facts 13355 1727096183.41662: worker is 1 (out of 1 available) 13355 1727096183.41786: exiting _queue_task() for managed_node3/service_facts 13355 1727096183.41800: done queuing things up, now waiting for results queue to drain 13355 1727096183.41801: waiting for pending results... 13355 1727096183.42330: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096183.42558: in run() - task 0afff68d-5257-c514-593f-00000000079b 13355 1727096183.42575: variable 'ansible_search_path' from source: unknown 13355 1727096183.42578: variable 'ansible_search_path' from source: unknown 13355 1727096183.42635: calling self._execute() 13355 1727096183.42822: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096183.42826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096183.42834: variable 'omit' from source: magic vars 13355 1727096183.43880: variable 'ansible_distribution_major_version' from source: facts 13355 1727096183.43884: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096183.43887: variable 'omit' from source: magic vars 13355 1727096183.44006: variable 'omit' from source: magic vars 13355 1727096183.44109: variable 'omit' from source: magic vars 13355 1727096183.44236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096183.44279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096183.44300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 
1727096183.44318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096183.44330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096183.44495: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096183.44498: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096183.44530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096183.44731: Set connection var ansible_shell_executable to /bin/sh 13355 1727096183.44737: Set connection var ansible_shell_type to sh 13355 1727096183.44860: Set connection var ansible_pipelining to False 13355 1727096183.44863: Set connection var ansible_connection to ssh 13355 1727096183.44865: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096183.44870: Set connection var ansible_timeout to 10 13355 1727096183.44887: variable 'ansible_shell_executable' from source: unknown 13355 1727096183.44890: variable 'ansible_connection' from source: unknown 13355 1727096183.44893: variable 'ansible_module_compression' from source: unknown 13355 1727096183.44901: variable 'ansible_shell_type' from source: unknown 13355 1727096183.44904: variable 'ansible_shell_executable' from source: unknown 13355 1727096183.44906: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096183.44912: variable 'ansible_pipelining' from source: unknown 13355 1727096183.44914: variable 'ansible_timeout' from source: unknown 13355 1727096183.44919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096183.45298: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096183.45302: variable 'omit' from source: magic vars 13355 1727096183.45311: starting attempt loop 13355 1727096183.45314: running the handler 13355 1727096183.45329: _low_level_execute_command(): starting 13355 1727096183.45339: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096183.46762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096183.46833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096183.46882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096183.46888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096183.47046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096183.48737: stdout chunk (state=3): >>>/root <<< 13355 1727096183.48844: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 13355 1727096183.48897: stderr chunk (state=3): >>><<< 13355 1727096183.48900: stdout chunk (state=3): >>><<< 13355 1727096183.49077: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096183.49080: _low_level_execute_command(): starting 13355 1727096183.49084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644 `" && echo ansible-tmp-1727096183.49036-14771-80804045632644="` echo /root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644 `" ) && sleep 0' 13355 1727096183.50379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13355 1727096183.50383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096183.50504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096183.50517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096183.50605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096183.50808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096183.50928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096183.52888: stdout chunk (state=3): >>>ansible-tmp-1727096183.49036-14771-80804045632644=/root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644 <<< 13355 1727096183.52995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096183.53017: stderr chunk (state=3): >>><<< 13355 1727096183.53020: stdout chunk (state=3): >>><<< 13355 1727096183.53037: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096183.49036-14771-80804045632644=/root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096183.53082: variable 'ansible_module_compression' from source: unknown 13355 1727096183.53120: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13355 1727096183.53152: variable 'ansible_facts' from source: unknown 13355 1727096183.53213: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/AnsiballZ_service_facts.py 13355 1727096183.53323: Sending initial data 13355 1727096183.53326: Sent initial data (159 bytes) 13355 1727096183.53753: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096183.53792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096183.53796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096183.53799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096183.53880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096183.53924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096183.55575: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096183.55602: stderr chunk (state=3): >>>debug2: 
Sending SSH2_FXP_REALPATH "." <<< 13355 1727096183.55637: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpbgnr65lm /root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/AnsiballZ_service_facts.py <<< 13355 1727096183.55640: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/AnsiballZ_service_facts.py" <<< 13355 1727096183.55670: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpbgnr65lm" to remote "/root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/AnsiballZ_service_facts.py" <<< 13355 1727096183.56182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096183.56226: stderr chunk (state=3): >>><<< 13355 1727096183.56229: stdout chunk (state=3): >>><<< 13355 1727096183.56281: done transferring module to remote 13355 1727096183.56293: _low_level_execute_command(): starting 13355 1727096183.56298: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/ /root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/AnsiballZ_service_facts.py && sleep 0' 13355 1727096183.56762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096183.56819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096183.56822: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096183.56825: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096183.56918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096183.56926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096183.58876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096183.58880: stderr chunk (state=3): >>><<< 13355 1727096183.58882: stdout chunk (state=3): >>><<< 13355 1727096183.58885: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096183.58887: _low_level_execute_command(): starting 13355 1727096183.58889: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/AnsiballZ_service_facts.py && sleep 0' 13355 1727096183.59344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096183.59348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096183.59350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096183.59352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096183.59408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096183.59412: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 13355 1727096183.59414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096183.59459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096185.28526: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 13355 1727096185.28587: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13355 1727096185.30312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096185.30337: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 13355 1727096185.30419: stderr chunk (state=3): >>><<< 13355 1727096185.30422: stdout chunk (state=3): >>><<< 13355 1727096185.30577: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", 
"source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
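The `service_facts` result above is a flat map from unit name to a `{name, state, status, source}` record, which downstream role tasks typically filter by `state`. As a minimal illustrative sketch (not the role's actual code), the filtering can be done like this in Python, using a small hypothetical excerpt of the payload captured above:

```python
# Illustrative sketch only: post-process a service map shaped like the
# ansible_facts.services payload in the log above. The dict here is a
# hand-picked excerpt from that output, not the full captured result.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "auditd.service": {"name": "auditd.service", "state": "running",
                       "status": "enabled", "source": "systemd"},
}

def running_services(svc_map):
    """Return the sorted names of services whose state is 'running'."""
    return sorted(name for name, info in svc_map.items()
                  if info.get("state") == "running")

print(running_services(services))  # ['auditd.service', 'sshd.service']
```

A role task can express the same filter in Jinja2 (for example with `selectattr('state', 'equalto', 'running')` over `ansible_facts.services.values()`); the Python form above just makes the record shape explicit.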
13355 1727096185.31252: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096185.31274: _low_level_execute_command(): starting 13355 1727096185.31284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096183.49036-14771-80804045632644/ > /dev/null 2>&1 && sleep 0' 13355 1727096185.31944: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096185.31961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096185.31978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096185.31995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096185.32109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096185.32139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096185.32212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096185.34147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096185.34164: stderr chunk (state=3): >>><<< 13355 1727096185.34174: stdout chunk (state=3): >>><<< 13355 1727096185.34198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096185.34220: handler run 
complete 13355 1727096185.34685: variable 'ansible_facts' from source: unknown 13355 1727096185.34879: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096185.35602: variable 'ansible_facts' from source: unknown 13355 1727096185.35731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096185.36133: attempt loop complete, returning result 13355 1727096185.36137: _execute() done 13355 1727096185.36139: dumping result to json 13355 1727096185.36330: done dumping result, returning 13355 1727096185.36334: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-c514-593f-00000000079b] 13355 1727096185.36336: sending task result for task 0afff68d-5257-c514-593f-00000000079b ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096185.37315: no more pending results, returning what we have 13355 1727096185.37318: results queue empty 13355 1727096185.37319: checking for any_errors_fatal 13355 1727096185.37322: done checking for any_errors_fatal 13355 1727096185.37323: checking for max_fail_percentage 13355 1727096185.37325: done checking for max_fail_percentage 13355 1727096185.37326: checking to see if all hosts have failed and the running result is not ok 13355 1727096185.37326: done checking to see if all hosts have failed 13355 1727096185.37327: getting the remaining hosts for this loop 13355 1727096185.37328: done getting the remaining hosts for this loop 13355 1727096185.37331: getting the next task for host managed_node3 13355 1727096185.37337: done getting next task for host managed_node3 13355 1727096185.37340: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096185.37344: ^ state is: HOST STATE: 
block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096185.37354: getting variables 13355 1727096185.37358: in VariableManager get_vars() 13355 1727096185.37402: Calling all_inventory to load vars for managed_node3 13355 1727096185.37405: Calling groups_inventory to load vars for managed_node3 13355 1727096185.37407: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096185.37416: Calling all_plugins_play to load vars for managed_node3 13355 1727096185.37419: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096185.37421: Calling groups_plugins_play to load vars for managed_node3 13355 1727096185.37981: done sending task result for task 0afff68d-5257-c514-593f-00000000079b 13355 1727096185.37985: WORKER PROCESS EXITING 13355 1727096185.40005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096185.52316: done with get_vars() 13355 1727096185.52354: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 
September 2024 08:56:25 -0400 (0:00:02.115) 0:00:34.785 ****** 13355 1727096185.52439: entering _queue_task() for managed_node3/package_facts 13355 1727096185.52840: worker is 1 (out of 1 available) 13355 1727096185.52852: exiting _queue_task() for managed_node3/package_facts 13355 1727096185.52864: done queuing things up, now waiting for results queue to drain 13355 1727096185.52865: waiting for pending results... 13355 1727096185.53157: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096185.53331: in run() - task 0afff68d-5257-c514-593f-00000000079c 13355 1727096185.53353: variable 'ansible_search_path' from source: unknown 13355 1727096185.53362: variable 'ansible_search_path' from source: unknown 13355 1727096185.53414: calling self._execute() 13355 1727096185.53523: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096185.53535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096185.53621: variable 'omit' from source: magic vars 13355 1727096185.53973: variable 'ansible_distribution_major_version' from source: facts 13355 1727096185.53991: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096185.54002: variable 'omit' from source: magic vars 13355 1727096185.54091: variable 'omit' from source: magic vars 13355 1727096185.54129: variable 'omit' from source: magic vars 13355 1727096185.54184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096185.54226: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096185.54252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096185.54282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 
1727096185.54297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096185.54331: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096185.54339: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096185.54384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096185.54458: Set connection var ansible_shell_executable to /bin/sh 13355 1727096185.54475: Set connection var ansible_shell_type to sh 13355 1727096185.54493: Set connection var ansible_pipelining to False 13355 1727096185.54503: Set connection var ansible_connection to ssh 13355 1727096185.54512: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096185.54574: Set connection var ansible_timeout to 10 13355 1727096185.54577: variable 'ansible_shell_executable' from source: unknown 13355 1727096185.54580: variable 'ansible_connection' from source: unknown 13355 1727096185.54583: variable 'ansible_module_compression' from source: unknown 13355 1727096185.54585: variable 'ansible_shell_type' from source: unknown 13355 1727096185.54588: variable 'ansible_shell_executable' from source: unknown 13355 1727096185.54590: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096185.54592: variable 'ansible_pipelining' from source: unknown 13355 1727096185.54602: variable 'ansible_timeout' from source: unknown 13355 1727096185.54604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096185.54809: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096185.54834: variable 'omit' from source: magic vars 13355 1727096185.54843: 
starting attempt loop 13355 1727096185.54849: running the handler 13355 1727096185.54929: _low_level_execute_command(): starting 13355 1727096185.54932: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096185.55700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096185.55759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096185.55937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096185.56210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096185.57910: stdout chunk (state=3): >>>/root <<< 13355 1727096185.58063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096185.58070: stdout chunk (state=3): >>><<< 13355 1727096185.58074: stderr chunk (state=3): >>><<< 13355 1727096185.58200: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096185.58203: _low_level_execute_command(): starting 13355 1727096185.58206: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764 `" && echo ansible-tmp-1727096185.5809863-14847-183864877019764="` echo /root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764 `" ) && sleep 0' 13355 1727096185.58742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096185.58760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096185.58793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096185.58873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 
1727096185.58893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096185.58950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096185.58971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096185.58996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096185.59075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096185.61097: stdout chunk (state=3): >>>ansible-tmp-1727096185.5809863-14847-183864877019764=/root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764 <<< 13355 1727096185.61275: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096185.61280: stdout chunk (state=3): >>><<< 13355 1727096185.61283: stderr chunk (state=3): >>><<< 13355 1727096185.61474: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096185.5809863-14847-183864877019764=/root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096185.61477: variable 'ansible_module_compression' from source: unknown 13355 1727096185.61480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13355 1727096185.61502: variable 'ansible_facts' from source: unknown 13355 1727096185.61683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/AnsiballZ_package_facts.py 13355 1727096185.61954: Sending initial data 13355 1727096185.61957: Sent initial data (162 bytes) 13355 1727096185.62701: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096185.62809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096185.62895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096185.63012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096185.63039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096185.63120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096185.64777: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096185.64832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096185.64854: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmplfrngir6 /root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/AnsiballZ_package_facts.py <<< 13355 1727096185.64858: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/AnsiballZ_package_facts.py" <<< 13355 1727096185.64922: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmplfrngir6" to remote "/root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/AnsiballZ_package_facts.py" <<< 13355 1727096185.66575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096185.66579: stderr chunk (state=3): >>><<< 13355 1727096185.66581: stdout chunk (state=3): >>><<< 13355 1727096185.66583: done transferring module to remote 13355 1727096185.66585: _low_level_execute_command(): starting 13355 1727096185.66587: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/ /root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/AnsiballZ_package_facts.py && sleep 0' 13355 1727096185.67613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096185.67820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096185.69976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096185.69981: stdout chunk (state=3): >>><<< 13355 1727096185.69983: stderr chunk (state=3): >>><<< 13355 1727096185.69985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096185.69992: _low_level_execute_command(): starting 13355 1727096185.69994: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/AnsiballZ_package_facts.py && sleep 0' 13355 1727096185.70473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096185.70484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096185.70495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096185.70510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096185.70523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096185.70539: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096185.70549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096185.70566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096185.70578: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096185.70585: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096185.70593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096185.70648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096185.70690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096185.70712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096185.70722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096185.70803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096186.17316: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": 
"4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": 
"polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", 
"version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", 
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": 
"python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 13355 1727096186.17421: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": 
[{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 13355 1727096186.17459: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 13355 1727096186.17477: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", 
"version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 13355 1727096186.17496: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13355 1727096186.19276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096186.19306: stderr chunk (state=3): >>><<< 13355 1727096186.19309: stdout chunk (state=3): >>><<< 13355 1727096186.19355: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096186.20548: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096186.20569: _low_level_execute_command(): starting 13355 1727096186.20574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096185.5809863-14847-183864877019764/ > /dev/null 2>&1 && sleep 0' 13355 1727096186.21030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096186.21039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096186.21060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096186.21063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096186.21121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096186.21124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096186.21132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096186.21163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096186.23106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096186.23110: stdout chunk (state=3): >>><<< 13355 1727096186.23112: stderr chunk (state=3): >>><<< 13355 1727096186.23117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 13355 1727096186.23125: handler run complete 13355 1727096186.23692: variable 'ansible_facts' from source: unknown 13355 1727096186.23954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.25073: variable 'ansible_facts' from source: unknown 13355 1727096186.25576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.26090: attempt loop complete, returning result 13355 1727096186.26102: _execute() done 13355 1727096186.26105: dumping result to json 13355 1727096186.26307: done dumping result, returning 13355 1727096186.26318: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-c514-593f-00000000079c] 13355 1727096186.26323: sending task result for task 0afff68d-5257-c514-593f-00000000079c 13355 1727096186.28669: done sending task result for task 0afff68d-5257-c514-593f-00000000079c 13355 1727096186.28673: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096186.28821: no more pending results, returning what we have 13355 1727096186.28824: results queue empty 13355 1727096186.28825: checking for any_errors_fatal 13355 1727096186.28830: done checking for any_errors_fatal 13355 1727096186.28831: checking for max_fail_percentage 13355 1727096186.28832: done checking for max_fail_percentage 13355 1727096186.28833: checking to see if all hosts have failed and the running result is not ok 13355 1727096186.28834: done checking to see if all hosts have failed 13355 1727096186.28835: getting the remaining hosts for this loop 13355 1727096186.28836: done getting the remaining hosts for this loop 13355 1727096186.28839: getting the next task for host managed_node3 13355 1727096186.28851: done 
getting next task for host managed_node3 13355 1727096186.28855: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096186.28857: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096186.28873: getting variables 13355 1727096186.28875: in VariableManager get_vars() 13355 1727096186.28916: Calling all_inventory to load vars for managed_node3 13355 1727096186.28919: Calling groups_inventory to load vars for managed_node3 13355 1727096186.28921: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096186.28930: Calling all_plugins_play to load vars for managed_node3 13355 1727096186.28932: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096186.28935: Calling groups_plugins_play to load vars for managed_node3 13355 1727096186.30269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.32058: done with get_vars() 13355 1727096186.32092: done getting variables 13355 1727096186.32242: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:56:26 -0400 (0:00:00.798) 0:00:35.583 ****** 13355 1727096186.32296: entering _queue_task() for managed_node3/debug 13355 1727096186.32618: worker is 1 (out of 1 available) 13355 1727096186.32631: exiting _queue_task() for managed_node3/debug 13355 1727096186.32645: done queuing things up, now waiting for results queue to drain 13355 1727096186.32647: waiting for pending results... 13355 1727096186.32821: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096186.32917: in run() - task 0afff68d-5257-c514-593f-0000000000d0 13355 1727096186.32929: variable 'ansible_search_path' from source: unknown 13355 1727096186.32932: variable 'ansible_search_path' from source: unknown 13355 1727096186.32962: calling self._execute() 13355 1727096186.33044: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.33047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.33059: variable 'omit' from source: magic vars 13355 1727096186.33340: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.33349: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096186.33354: variable 'omit' from source: magic vars 13355 1727096186.33397: variable 'omit' from source: magic vars 13355 1727096186.33465: variable 'network_provider' from source: set_fact 13355 1727096186.33481: variable 'omit' from source: magic vars 13355 1727096186.33511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096186.33541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096186.33558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 
1727096186.33571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096186.33581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096186.33604: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096186.33606: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.33609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.33680: Set connection var ansible_shell_executable to /bin/sh 13355 1727096186.33684: Set connection var ansible_shell_type to sh 13355 1727096186.33690: Set connection var ansible_pipelining to False 13355 1727096186.33694: Set connection var ansible_connection to ssh 13355 1727096186.33699: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096186.33704: Set connection var ansible_timeout to 10 13355 1727096186.33722: variable 'ansible_shell_executable' from source: unknown 13355 1727096186.33726: variable 'ansible_connection' from source: unknown 13355 1727096186.33728: variable 'ansible_module_compression' from source: unknown 13355 1727096186.33730: variable 'ansible_shell_type' from source: unknown 13355 1727096186.33732: variable 'ansible_shell_executable' from source: unknown 13355 1727096186.33735: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.33744: variable 'ansible_pipelining' from source: unknown 13355 1727096186.33747: variable 'ansible_timeout' from source: unknown 13355 1727096186.33749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.33841: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096186.33850: variable 'omit' from source: magic vars 13355 1727096186.33860: starting attempt loop 13355 1727096186.33863: running the handler 13355 1727096186.33896: handler run complete 13355 1727096186.33909: attempt loop complete, returning result 13355 1727096186.33912: _execute() done 13355 1727096186.33914: dumping result to json 13355 1727096186.33917: done dumping result, returning 13355 1727096186.33924: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-c514-593f-0000000000d0] 13355 1727096186.33929: sending task result for task 0afff68d-5257-c514-593f-0000000000d0 13355 1727096186.34014: done sending task result for task 0afff68d-5257-c514-593f-0000000000d0 13355 1727096186.34017: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 13355 1727096186.34086: no more pending results, returning what we have 13355 1727096186.34089: results queue empty 13355 1727096186.34090: checking for any_errors_fatal 13355 1727096186.34100: done checking for any_errors_fatal 13355 1727096186.34101: checking for max_fail_percentage 13355 1727096186.34103: done checking for max_fail_percentage 13355 1727096186.34104: checking to see if all hosts have failed and the running result is not ok 13355 1727096186.34105: done checking to see if all hosts have failed 13355 1727096186.34105: getting the remaining hosts for this loop 13355 1727096186.34106: done getting the remaining hosts for this loop 13355 1727096186.34109: getting the next task for host managed_node3 13355 1727096186.34115: done getting next task for host managed_node3 13355 1727096186.34118: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 13355 1727096186.34121: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096186.34184: getting variables 13355 1727096186.34186: in VariableManager get_vars() 13355 1727096186.34228: Calling all_inventory to load vars for managed_node3 13355 1727096186.34230: Calling groups_inventory to load vars for managed_node3 13355 1727096186.34233: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096186.34246: Calling all_plugins_play to load vars for managed_node3 13355 1727096186.34252: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096186.34258: Calling groups_plugins_play to load vars for managed_node3 13355 1727096186.35460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.36337: done with get_vars() 13355 1727096186.36354: done getting variables 13355 1727096186.36398: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:56:26 -0400 (0:00:00.041) 0:00:35.624 ****** 13355 1727096186.36424: entering _queue_task() for managed_node3/fail 13355 1727096186.36666: worker is 1 (out of 1 available) 13355 1727096186.36682: exiting _queue_task() for managed_node3/fail 13355 1727096186.36694: done queuing things up, now waiting for results queue to drain 13355 1727096186.36696: waiting for pending results... 13355 1727096186.36871: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13355 1727096186.36958: in run() - task 0afff68d-5257-c514-593f-0000000000d1 13355 1727096186.36970: variable 'ansible_search_path' from source: unknown 13355 1727096186.36974: variable 'ansible_search_path' from source: unknown 13355 1727096186.37002: calling self._execute() 13355 1727096186.37081: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.37085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.37092: variable 'omit' from source: magic vars 13355 1727096186.37441: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.37462: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096186.37592: variable 'network_state' from source: role '' defaults 13355 1727096186.37609: Evaluated conditional (network_state != {}): False 13355 1727096186.37617: when evaluation is False, skipping this task 13355 1727096186.37624: _execute() done 13355 1727096186.37631: dumping result to json 13355 1727096186.37637: done dumping result, returning 13355 1727096186.37647: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [0afff68d-5257-c514-593f-0000000000d1] 13355 1727096186.37664: sending task result for task 0afff68d-5257-c514-593f-0000000000d1 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096186.38012: no more pending results, returning what we have 13355 1727096186.38016: results queue empty 13355 1727096186.38017: checking for any_errors_fatal 13355 1727096186.38023: done checking for any_errors_fatal 13355 1727096186.38024: checking for max_fail_percentage 13355 1727096186.38025: done checking for max_fail_percentage 13355 1727096186.38026: checking to see if all hosts have failed and the running result is not ok 13355 1727096186.38027: done checking to see if all hosts have failed 13355 1727096186.38028: getting the remaining hosts for this loop 13355 1727096186.38029: done getting the remaining hosts for this loop 13355 1727096186.38033: getting the next task for host managed_node3 13355 1727096186.38039: done getting next task for host managed_node3 13355 1727096186.38043: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096186.38045: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096186.38070: getting variables 13355 1727096186.38072: in VariableManager get_vars() 13355 1727096186.38120: Calling all_inventory to load vars for managed_node3 13355 1727096186.38122: Calling groups_inventory to load vars for managed_node3 13355 1727096186.38125: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096186.38134: Calling all_plugins_play to load vars for managed_node3 13355 1727096186.38137: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096186.38139: Calling groups_plugins_play to load vars for managed_node3 13355 1727096186.38680: done sending task result for task 0afff68d-5257-c514-593f-0000000000d1 13355 1727096186.38683: WORKER PROCESS EXITING 13355 1727096186.39321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.40173: done with get_vars() 13355 1727096186.40189: done getting variables 13355 1727096186.40234: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:56:26 -0400 (0:00:00.038) 0:00:35.663 ****** 13355 1727096186.40270: entering _queue_task() for managed_node3/fail 13355 1727096186.40618: worker is 1 (out of 1 available) 13355 1727096186.40629: exiting _queue_task() for managed_node3/fail 13355 1727096186.40642: done queuing things up, now waiting for results queue to drain 13355 1727096186.40643: waiting for pending results... 
13355 1727096186.40970: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096186.41072: in run() - task 0afff68d-5257-c514-593f-0000000000d2 13355 1727096186.41100: variable 'ansible_search_path' from source: unknown 13355 1727096186.41104: variable 'ansible_search_path' from source: unknown 13355 1727096186.41208: calling self._execute() 13355 1727096186.41219: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.41222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.41325: variable 'omit' from source: magic vars 13355 1727096186.41628: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.41654: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096186.41861: variable 'network_state' from source: role '' defaults 13355 1727096186.41865: Evaluated conditional (network_state != {}): False 13355 1727096186.41869: when evaluation is False, skipping this task 13355 1727096186.41872: _execute() done 13355 1727096186.41874: dumping result to json 13355 1727096186.41878: done dumping result, returning 13355 1727096186.41881: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-c514-593f-0000000000d2] 13355 1727096186.41883: sending task result for task 0afff68d-5257-c514-593f-0000000000d2 13355 1727096186.41953: done sending task result for task 0afff68d-5257-c514-593f-0000000000d2 13355 1727096186.41962: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096186.42018: no more pending results, returning what we have 13355 
1727096186.42023: results queue empty 13355 1727096186.42024: checking for any_errors_fatal 13355 1727096186.42032: done checking for any_errors_fatal 13355 1727096186.42033: checking for max_fail_percentage 13355 1727096186.42034: done checking for max_fail_percentage 13355 1727096186.42035: checking to see if all hosts have failed and the running result is not ok 13355 1727096186.42036: done checking to see if all hosts have failed 13355 1727096186.42037: getting the remaining hosts for this loop 13355 1727096186.42038: done getting the remaining hosts for this loop 13355 1727096186.42042: getting the next task for host managed_node3 13355 1727096186.42049: done getting next task for host managed_node3 13355 1727096186.42053: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096186.42059: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096186.42174: getting variables 13355 1727096186.42176: in VariableManager get_vars() 13355 1727096186.42225: Calling all_inventory to load vars for managed_node3 13355 1727096186.42228: Calling groups_inventory to load vars for managed_node3 13355 1727096186.42230: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096186.42242: Calling all_plugins_play to load vars for managed_node3 13355 1727096186.42245: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096186.42247: Calling groups_plugins_play to load vars for managed_node3 13355 1727096186.43056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.43926: done with get_vars() 13355 1727096186.43942: done getting variables 13355 1727096186.44008: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:56:26 -0400 (0:00:00.037) 0:00:35.701 ****** 13355 1727096186.44041: entering _queue_task() for managed_node3/fail 13355 1727096186.44354: worker is 1 (out of 1 available) 13355 1727096186.44571: exiting _queue_task() for managed_node3/fail 13355 1727096186.44582: done queuing things up, now waiting for results queue to drain 13355 1727096186.44584: waiting for pending results... 
13355 1727096186.44896: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096186.44901: in run() - task 0afff68d-5257-c514-593f-0000000000d3 13355 1727096186.44904: variable 'ansible_search_path' from source: unknown 13355 1727096186.44907: variable 'ansible_search_path' from source: unknown 13355 1727096186.44911: calling self._execute() 13355 1727096186.44983: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.45009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.45047: variable 'omit' from source: magic vars 13355 1727096186.45326: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.45336: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096186.45458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096186.47176: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096186.47218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096186.47246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096186.47274: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096186.47294: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096186.47359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.47382: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.47399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.47425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.47436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.47505: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.47518: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13355 1727096186.47599: variable 'ansible_distribution' from source: facts 13355 1727096186.47603: variable '__network_rh_distros' from source: role '' defaults 13355 1727096186.47610: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13355 1727096186.47777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.47793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.47812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 
1727096186.47839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.47849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.47885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.47902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.47919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.47944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.47955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.47988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.48005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13355 1727096186.48021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.48046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.48059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.48244: variable 'network_connections' from source: task vars 13355 1727096186.48254: variable 'controller_profile' from source: play vars 13355 1727096186.48301: variable 'controller_profile' from source: play vars 13355 1727096186.48308: variable 'controller_device' from source: play vars 13355 1727096186.48351: variable 'controller_device' from source: play vars 13355 1727096186.48362: variable 'port1_profile' from source: play vars 13355 1727096186.48404: variable 'port1_profile' from source: play vars 13355 1727096186.48410: variable 'dhcp_interface1' from source: play vars 13355 1727096186.48452: variable 'dhcp_interface1' from source: play vars 13355 1727096186.48460: variable 'controller_profile' from source: play vars 13355 1727096186.48503: variable 'controller_profile' from source: play vars 13355 1727096186.48509: variable 'port2_profile' from source: play vars 13355 1727096186.48552: variable 'port2_profile' from source: play vars 13355 1727096186.48560: variable 'dhcp_interface2' from source: play vars 13355 1727096186.48603: variable 'dhcp_interface2' from source: play vars 13355 1727096186.48610: variable 'controller_profile' from source: play vars 13355 1727096186.48659: variable 'controller_profile' from source: play vars 13355 1727096186.48662: 
variable 'network_state' from source: role '' defaults 13355 1727096186.48711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096186.48883: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096186.48908: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096186.48932: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096186.48952: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096186.48992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096186.49005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096186.49023: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.49043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096186.49074: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13355 1727096186.49078: when evaluation is False, skipping this task 13355 1727096186.49081: _execute() done 13355 1727096186.49083: dumping result to 
json 13355 1727096186.49085: done dumping result, returning 13355 1727096186.49277: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-c514-593f-0000000000d3] 13355 1727096186.49281: sending task result for task 0afff68d-5257-c514-593f-0000000000d3 13355 1727096186.49350: done sending task result for task 0afff68d-5257-c514-593f-0000000000d3 13355 1727096186.49354: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13355 1727096186.49476: no more pending results, returning what we have 13355 1727096186.49479: results queue empty 13355 1727096186.49479: checking for any_errors_fatal 13355 1727096186.49484: done checking for any_errors_fatal 13355 1727096186.49485: checking for max_fail_percentage 13355 1727096186.49486: done checking for max_fail_percentage 13355 1727096186.49487: checking to see if all hosts have failed and the running result is not ok 13355 1727096186.49488: done checking to see if all hosts have failed 13355 1727096186.49488: getting the remaining hosts for this loop 13355 1727096186.49490: done getting the remaining hosts for this loop 13355 1727096186.49493: getting the next task for host managed_node3 13355 1727096186.49498: done getting next task for host managed_node3 13355 1727096186.49502: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13355 1727096186.49504: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096186.49523: getting variables 13355 1727096186.49524: in VariableManager get_vars() 13355 1727096186.49576: Calling all_inventory to load vars for managed_node3 13355 1727096186.49580: Calling groups_inventory to load vars for managed_node3 13355 1727096186.49582: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096186.49591: Calling all_plugins_play to load vars for managed_node3 13355 1727096186.49594: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096186.49597: Calling groups_plugins_play to load vars for managed_node3 13355 1727096186.51708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.53840: done with get_vars() 13355 1727096186.53884: done getting variables 13355 1727096186.53949: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:56:26 -0400 (0:00:00.099) 0:00:35.800 ****** 
13355 1727096186.53998: entering _queue_task() for managed_node3/dnf 13355 1727096186.54326: worker is 1 (out of 1 available) 13355 1727096186.54338: exiting _queue_task() for managed_node3/dnf 13355 1727096186.54351: done queuing things up, now waiting for results queue to drain 13355 1727096186.54353: waiting for pending results... 13355 1727096186.54657: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13355 1727096186.54748: in run() - task 0afff68d-5257-c514-593f-0000000000d4 13355 1727096186.54763: variable 'ansible_search_path' from source: unknown 13355 1727096186.54770: variable 'ansible_search_path' from source: unknown 13355 1727096186.54799: calling self._execute() 13355 1727096186.54881: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.54884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.54892: variable 'omit' from source: magic vars 13355 1727096186.55185: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.55195: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096186.55331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096186.57630: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096186.57677: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096186.57719: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096186.57766: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096186.57799: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096186.57955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.57958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.57973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.58016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.58034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.58171: variable 'ansible_distribution' from source: facts 13355 1727096186.58185: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.58204: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13355 1727096186.58331: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096186.58475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.58515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.58547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.58609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.58618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.58719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.58722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.58724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.58763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.58784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.58831: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.58860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.58891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.58939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.58957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.59128: variable 'network_connections' from source: task vars 13355 1727096186.59153: variable 'controller_profile' from source: play vars 13355 1727096186.59263: variable 'controller_profile' from source: play vars 13355 1727096186.59266: variable 'controller_device' from source: play vars 13355 1727096186.59302: variable 'controller_device' from source: play vars 13355 1727096186.59317: variable 'port1_profile' from source: play vars 13355 1727096186.59385: variable 'port1_profile' from source: play vars 13355 1727096186.59396: variable 'dhcp_interface1' from source: play vars 13355 1727096186.59457: variable 'dhcp_interface1' from source: play vars 13355 1727096186.59474: variable 'controller_profile' from source: play vars 13355 1727096186.59538: variable 'controller_profile' from source: play vars 13355 1727096186.59582: variable 'port2_profile' from source: play vars 13355 
1727096186.59624: variable 'port2_profile' from source: play vars 13355 1727096186.59635: variable 'dhcp_interface2' from source: play vars 13355 1727096186.59704: variable 'dhcp_interface2' from source: play vars 13355 1727096186.59716: variable 'controller_profile' from source: play vars 13355 1727096186.59799: variable 'controller_profile' from source: play vars 13355 1727096186.59865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096186.60127: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096186.60130: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096186.60137: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096186.60173: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096186.60219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096186.60266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096186.60300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.60328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096186.60401: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096186.60657: variable 
'network_connections' from source: task vars 13355 1727096186.60677: variable 'controller_profile' from source: play vars 13355 1727096186.60738: variable 'controller_profile' from source: play vars 13355 1727096186.60750: variable 'controller_device' from source: play vars 13355 1727096186.60817: variable 'controller_device' from source: play vars 13355 1727096186.60882: variable 'port1_profile' from source: play vars 13355 1727096186.60899: variable 'port1_profile' from source: play vars 13355 1727096186.60914: variable 'dhcp_interface1' from source: play vars 13355 1727096186.60973: variable 'dhcp_interface1' from source: play vars 13355 1727096186.60998: variable 'controller_profile' from source: play vars 13355 1727096186.61053: variable 'controller_profile' from source: play vars 13355 1727096186.61099: variable 'port2_profile' from source: play vars 13355 1727096186.61135: variable 'port2_profile' from source: play vars 13355 1727096186.61146: variable 'dhcp_interface2' from source: play vars 13355 1727096186.61215: variable 'dhcp_interface2' from source: play vars 13355 1727096186.61226: variable 'controller_profile' from source: play vars 13355 1727096186.61287: variable 'controller_profile' from source: play vars 13355 1727096186.61372: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096186.61375: when evaluation is False, skipping this task 13355 1727096186.61377: _execute() done 13355 1727096186.61379: dumping result to json 13355 1727096186.61381: done dumping result, returning 13355 1727096186.61383: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-0000000000d4] 13355 1727096186.61385: sending task result for task 0afff68d-5257-c514-593f-0000000000d4 skipping: [managed_node3] => { "changed": false, 
"false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096186.61721: no more pending results, returning what we have 13355 1727096186.61725: results queue empty 13355 1727096186.61725: checking for any_errors_fatal 13355 1727096186.61732: done checking for any_errors_fatal 13355 1727096186.61732: checking for max_fail_percentage 13355 1727096186.61734: done checking for max_fail_percentage 13355 1727096186.61735: checking to see if all hosts have failed and the running result is not ok 13355 1727096186.61736: done checking to see if all hosts have failed 13355 1727096186.61737: getting the remaining hosts for this loop 13355 1727096186.61738: done getting the remaining hosts for this loop 13355 1727096186.61742: getting the next task for host managed_node3 13355 1727096186.61749: done getting next task for host managed_node3 13355 1727096186.61753: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13355 1727096186.61756: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096186.61779: getting variables 13355 1727096186.61781: in VariableManager get_vars() 13355 1727096186.61835: Calling all_inventory to load vars for managed_node3 13355 1727096186.61838: Calling groups_inventory to load vars for managed_node3 13355 1727096186.61840: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096186.61850: Calling all_plugins_play to load vars for managed_node3 13355 1727096186.61853: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096186.61856: Calling groups_plugins_play to load vars for managed_node3 13355 1727096186.62410: done sending task result for task 0afff68d-5257-c514-593f-0000000000d4 13355 1727096186.62414: WORKER PROCESS EXITING 13355 1727096186.63376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.65169: done with get_vars() 13355 1727096186.65192: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13355 1727096186.65277: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:56:26 -0400 (0:00:00.113) 0:00:35.913 ****** 13355 1727096186.65309: entering _queue_task() for managed_node3/yum 13355 1727096186.65808: worker is 1 (out of 1 available) 13355 1727096186.65819: exiting _queue_task() for managed_node3/yum 13355 1727096186.65830: done queuing things up, now 
waiting for results queue to drain 13355 1727096186.65832: waiting for pending results... 13355 1727096186.66032: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13355 1727096186.66196: in run() - task 0afff68d-5257-c514-593f-0000000000d5 13355 1727096186.66223: variable 'ansible_search_path' from source: unknown 13355 1727096186.66233: variable 'ansible_search_path' from source: unknown 13355 1727096186.66279: calling self._execute() 13355 1727096186.66385: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.66395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.66406: variable 'omit' from source: magic vars 13355 1727096186.66828: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.66847: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096186.67416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096186.69682: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096186.69758: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096186.69805: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096186.69850: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096186.69884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096186.69976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.70029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.70142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.70146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.70148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.70233: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.70261: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13355 1727096186.70270: when evaluation is False, skipping this task 13355 1727096186.70277: _execute() done 13355 1727096186.70283: dumping result to json 13355 1727096186.70290: done dumping result, returning 13355 1727096186.70302: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-0000000000d5] 13355 1727096186.70310: sending task result for task 0afff68d-5257-c514-593f-0000000000d5 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13355 1727096186.70517: no more pending results, returning 
what we have 13355 1727096186.70521: results queue empty 13355 1727096186.70522: checking for any_errors_fatal 13355 1727096186.70528: done checking for any_errors_fatal 13355 1727096186.70529: checking for max_fail_percentage 13355 1727096186.70531: done checking for max_fail_percentage 13355 1727096186.70532: checking to see if all hosts have failed and the running result is not ok 13355 1727096186.70533: done checking to see if all hosts have failed 13355 1727096186.70533: getting the remaining hosts for this loop 13355 1727096186.70535: done getting the remaining hosts for this loop 13355 1727096186.70539: getting the next task for host managed_node3 13355 1727096186.70546: done getting next task for host managed_node3 13355 1727096186.70550: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096186.70553: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096186.70580: getting variables 13355 1727096186.70582: in VariableManager get_vars() 13355 1727096186.70637: Calling all_inventory to load vars for managed_node3 13355 1727096186.70640: Calling groups_inventory to load vars for managed_node3 13355 1727096186.70642: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096186.70653: Calling all_plugins_play to load vars for managed_node3 13355 1727096186.70656: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096186.70659: Calling groups_plugins_play to load vars for managed_node3 13355 1727096186.71280: done sending task result for task 0afff68d-5257-c514-593f-0000000000d5 13355 1727096186.71283: WORKER PROCESS EXITING 13355 1727096186.73100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.76293: done with get_vars() 13355 1727096186.76322: done getting variables 13355 1727096186.76590: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:56:26 -0400 (0:00:00.113) 0:00:36.026 ****** 13355 1727096186.76628: entering _queue_task() for managed_node3/fail 13355 1727096186.77405: worker is 1 (out of 1 available) 13355 1727096186.77416: exiting _queue_task() for managed_node3/fail 13355 1727096186.77428: done queuing things up, now waiting for results queue to drain 13355 1727096186.77429: waiting for pending results... 
13355 1727096186.77674: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096186.77966: in run() - task 0afff68d-5257-c514-593f-0000000000d6 13355 1727096186.77983: variable 'ansible_search_path' from source: unknown 13355 1727096186.77987: variable 'ansible_search_path' from source: unknown 13355 1727096186.78130: calling self._execute() 13355 1727096186.78388: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.78391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.78395: variable 'omit' from source: magic vars 13355 1727096186.79358: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.79384: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096186.79619: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096186.80230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096186.82284: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096186.82331: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096186.82358: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096186.82388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096186.82408: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096186.82473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096186.82764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.82785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.82813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.82824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.82862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.82881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.82898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.82923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.82934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.82991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.83003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.83020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.83111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.83114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.83373: variable 'network_connections' from source: task vars 13355 1727096186.83377: variable 'controller_profile' from source: play vars 13355 1727096186.83379: variable 'controller_profile' from source: play vars 13355 1727096186.83402: variable 'controller_device' from source: play vars 13355 1727096186.83465: variable 'controller_device' from source: play vars 13355 1727096186.83654: variable 'port1_profile' from source: play vars 13355 1727096186.84161: variable 'port1_profile' from source: play vars 13355 1727096186.84164: variable 'dhcp_interface1' from source: play vars 13355 1727096186.84167: variable 'dhcp_interface1' from source: play vars 13355 1727096186.84171: variable 'controller_profile' from source: play vars 13355 
1727096186.84174: variable 'controller_profile' from source: play vars 13355 1727096186.84175: variable 'port2_profile' from source: play vars 13355 1727096186.84177: variable 'port2_profile' from source: play vars 13355 1727096186.84179: variable 'dhcp_interface2' from source: play vars 13355 1727096186.84181: variable 'dhcp_interface2' from source: play vars 13355 1727096186.84183: variable 'controller_profile' from source: play vars 13355 1727096186.84197: variable 'controller_profile' from source: play vars 13355 1727096186.84282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096186.84470: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096186.84513: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096186.84543: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096186.84579: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096186.84626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096186.84649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096186.84699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.84726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13355 1727096186.84827: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096186.85059: variable 'network_connections' from source: task vars 13355 1727096186.85069: variable 'controller_profile' from source: play vars 13355 1727096186.85109: variable 'controller_profile' from source: play vars 13355 1727096186.85115: variable 'controller_device' from source: play vars 13355 1727096186.85162: variable 'controller_device' from source: play vars 13355 1727096186.85169: variable 'port1_profile' from source: play vars 13355 1727096186.85209: variable 'port1_profile' from source: play vars 13355 1727096186.85215: variable 'dhcp_interface1' from source: play vars 13355 1727096186.85271: variable 'dhcp_interface1' from source: play vars 13355 1727096186.85277: variable 'controller_profile' from source: play vars 13355 1727096186.85317: variable 'controller_profile' from source: play vars 13355 1727096186.85323: variable 'port2_profile' from source: play vars 13355 1727096186.85369: variable 'port2_profile' from source: play vars 13355 1727096186.85375: variable 'dhcp_interface2' from source: play vars 13355 1727096186.85416: variable 'dhcp_interface2' from source: play vars 13355 1727096186.85421: variable 'controller_profile' from source: play vars 13355 1727096186.85466: variable 'controller_profile' from source: play vars 13355 1727096186.85492: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096186.85495: when evaluation is False, skipping this task 13355 1727096186.85497: _execute() done 13355 1727096186.85500: dumping result to json 13355 1727096186.85501: done dumping result, returning 13355 1727096186.85510: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-0000000000d6] 13355 1727096186.85515: sending 
task result for task 0afff68d-5257-c514-593f-0000000000d6 13355 1727096186.85604: done sending task result for task 0afff68d-5257-c514-593f-0000000000d6 13355 1727096186.85607: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096186.85653: no more pending results, returning what we have 13355 1727096186.85659: results queue empty 13355 1727096186.85660: checking for any_errors_fatal 13355 1727096186.85668: done checking for any_errors_fatal 13355 1727096186.85669: checking for max_fail_percentage 13355 1727096186.85671: done checking for max_fail_percentage 13355 1727096186.85672: checking to see if all hosts have failed and the running result is not ok 13355 1727096186.85673: done checking to see if all hosts have failed 13355 1727096186.85673: getting the remaining hosts for this loop 13355 1727096186.85675: done getting the remaining hosts for this loop 13355 1727096186.85678: getting the next task for host managed_node3 13355 1727096186.85684: done getting next task for host managed_node3 13355 1727096186.85688: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13355 1727096186.85691: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096186.85711: getting variables 13355 1727096186.85712: in VariableManager get_vars() 13355 1727096186.85764: Calling all_inventory to load vars for managed_node3 13355 1727096186.85766: Calling groups_inventory to load vars for managed_node3 13355 1727096186.85776: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096186.85785: Calling all_plugins_play to load vars for managed_node3 13355 1727096186.85787: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096186.85790: Calling groups_plugins_play to load vars for managed_node3 13355 1727096186.86930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096186.88940: done with get_vars() 13355 1727096186.88963: done getting variables 13355 1727096186.89019: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:56:26 -0400 (0:00:00.124) 0:00:36.151 ****** 13355 1727096186.89046: entering _queue_task() for managed_node3/package 13355 1727096186.89307: worker is 1 (out of 1 available) 13355 1727096186.89322: exiting _queue_task() for managed_node3/package 13355 1727096186.89335: done queuing things up, now waiting for results queue to drain 13355 1727096186.89336: waiting for pending results... 
13355 1727096186.89524: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 13355 1727096186.89623: in run() - task 0afff68d-5257-c514-593f-0000000000d7 13355 1727096186.89634: variable 'ansible_search_path' from source: unknown 13355 1727096186.89637: variable 'ansible_search_path' from source: unknown 13355 1727096186.89684: calling self._execute() 13355 1727096186.89765: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096186.89771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096186.89784: variable 'omit' from source: magic vars 13355 1727096186.90151: variable 'ansible_distribution_major_version' from source: facts 13355 1727096186.90160: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096186.90385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096186.90746: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096186.90863: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096186.90928: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096186.91005: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096186.91254: variable 'network_packages' from source: role '' defaults 13355 1727096186.91282: variable '__network_provider_setup' from source: role '' defaults 13355 1727096186.91296: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096186.91359: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096186.91386: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096186.91449: variable 
'__network_packages_default_nm' from source: role '' defaults 13355 1727096186.91662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096186.93872: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096186.93919: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096186.93949: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096186.93977: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096186.93997: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096186.94056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.94082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.94100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.94127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.94137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 
1727096186.94176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.94193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.94210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.94236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.94246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.94406: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096186.94515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.94525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.94597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.94600: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.94603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.94740: variable 'ansible_python' from source: facts 13355 1727096186.94743: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096186.94794: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096186.94875: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096186.95203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.95207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.95210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.95212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.95214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.95216: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096186.95225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096186.95227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.95281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096186.95284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096186.95393: variable 'network_connections' from source: task vars 13355 1727096186.95420: variable 'controller_profile' from source: play vars 13355 1727096186.95608: variable 'controller_profile' from source: play vars 13355 1727096186.95612: variable 'controller_device' from source: play vars 13355 1727096186.95614: variable 'controller_device' from source: play vars 13355 1727096186.95617: variable 'port1_profile' from source: play vars 13355 1727096186.95914: variable 'port1_profile' from source: play vars 13355 1727096186.95928: variable 'dhcp_interface1' from source: play vars 13355 1727096186.96048: variable 'dhcp_interface1' from source: play vars 13355 1727096186.96056: variable 'controller_profile' from source: play vars 13355 1727096186.96153: variable 'controller_profile' from source: play vars 13355 1727096186.96165: variable 'port2_profile' from source: play vars 13355 
1727096186.96262: variable 'port2_profile' from source: play vars 13355 1727096186.96278: variable 'dhcp_interface2' from source: play vars 13355 1727096186.96379: variable 'dhcp_interface2' from source: play vars 13355 1727096186.96388: variable 'controller_profile' from source: play vars 13355 1727096186.96489: variable 'controller_profile' from source: play vars 13355 1727096186.96573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096186.96605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096186.96634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096186.96657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096186.96702: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096186.96892: variable 'network_connections' from source: task vars 13355 1727096186.96895: variable 'controller_profile' from source: play vars 13355 1727096186.96972: variable 'controller_profile' from source: play vars 13355 1727096186.96975: variable 'controller_device' from source: play vars 13355 1727096186.97044: variable 'controller_device' from source: play vars 13355 1727096186.97053: variable 'port1_profile' from source: play vars 13355 1727096186.97125: variable 'port1_profile' from source: play vars 13355 1727096186.97133: variable 'dhcp_interface1' from source: play vars 13355 1727096186.97205: variable 'dhcp_interface1' from source: 
play vars 13355 1727096186.97210: variable 'controller_profile' from source: play vars 13355 1727096186.97283: variable 'controller_profile' from source: play vars 13355 1727096186.97291: variable 'port2_profile' from source: play vars 13355 1727096186.97358: variable 'port2_profile' from source: play vars 13355 1727096186.97370: variable 'dhcp_interface2' from source: play vars 13355 1727096186.97438: variable 'dhcp_interface2' from source: play vars 13355 1727096186.97447: variable 'controller_profile' from source: play vars 13355 1727096186.97517: variable 'controller_profile' from source: play vars 13355 1727096186.97559: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096186.97619: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096186.97819: variable 'network_connections' from source: task vars 13355 1727096186.97823: variable 'controller_profile' from source: play vars 13355 1727096186.97874: variable 'controller_profile' from source: play vars 13355 1727096186.97880: variable 'controller_device' from source: play vars 13355 1727096186.97925: variable 'controller_device' from source: play vars 13355 1727096186.97932: variable 'port1_profile' from source: play vars 13355 1727096186.97981: variable 'port1_profile' from source: play vars 13355 1727096186.97987: variable 'dhcp_interface1' from source: play vars 13355 1727096186.98032: variable 'dhcp_interface1' from source: play vars 13355 1727096186.98037: variable 'controller_profile' from source: play vars 13355 1727096186.98086: variable 'controller_profile' from source: play vars 13355 1727096186.98092: variable 'port2_profile' from source: play vars 13355 1727096186.98137: variable 'port2_profile' from source: play vars 13355 1727096186.98142: variable 'dhcp_interface2' from source: play vars 13355 1727096186.98191: variable 'dhcp_interface2' from source: play vars 13355 1727096186.98197: variable 'controller_profile' from 
source: play vars 13355 1727096186.98243: variable 'controller_profile' from source: play vars 13355 1727096186.98266: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096186.98322: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096186.98520: variable 'network_connections' from source: task vars 13355 1727096186.98524: variable 'controller_profile' from source: play vars 13355 1727096186.98573: variable 'controller_profile' from source: play vars 13355 1727096186.98580: variable 'controller_device' from source: play vars 13355 1727096186.98623: variable 'controller_device' from source: play vars 13355 1727096186.98632: variable 'port1_profile' from source: play vars 13355 1727096186.98681: variable 'port1_profile' from source: play vars 13355 1727096186.98687: variable 'dhcp_interface1' from source: play vars 13355 1727096186.98732: variable 'dhcp_interface1' from source: play vars 13355 1727096186.98735: variable 'controller_profile' from source: play vars 13355 1727096186.98783: variable 'controller_profile' from source: play vars 13355 1727096186.98789: variable 'port2_profile' from source: play vars 13355 1727096186.98836: variable 'port2_profile' from source: play vars 13355 1727096186.98840: variable 'dhcp_interface2' from source: play vars 13355 1727096186.98890: variable 'dhcp_interface2' from source: play vars 13355 1727096186.98898: variable 'controller_profile' from source: play vars 13355 1727096186.98943: variable 'controller_profile' from source: play vars 13355 1727096186.98997: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096186.99039: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096186.99045: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096186.99091: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 
1727096186.99231: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096186.99565: variable 'network_connections' from source: task vars 13355 1727096186.99570: variable 'controller_profile' from source: play vars 13355 1727096186.99615: variable 'controller_profile' from source: play vars 13355 1727096186.99623: variable 'controller_device' from source: play vars 13355 1727096186.99666: variable 'controller_device' from source: play vars 13355 1727096186.99675: variable 'port1_profile' from source: play vars 13355 1727096186.99717: variable 'port1_profile' from source: play vars 13355 1727096186.99723: variable 'dhcp_interface1' from source: play vars 13355 1727096186.99769: variable 'dhcp_interface1' from source: play vars 13355 1727096186.99774: variable 'controller_profile' from source: play vars 13355 1727096186.99817: variable 'controller_profile' from source: play vars 13355 1727096186.99823: variable 'port2_profile' from source: play vars 13355 1727096186.99870: variable 'port2_profile' from source: play vars 13355 1727096186.99874: variable 'dhcp_interface2' from source: play vars 13355 1727096186.99920: variable 'dhcp_interface2' from source: play vars 13355 1727096186.99923: variable 'controller_profile' from source: play vars 13355 1727096186.99969: variable 'controller_profile' from source: play vars 13355 1727096186.99976: variable 'ansible_distribution' from source: facts 13355 1727096186.99979: variable '__network_rh_distros' from source: role '' defaults 13355 1727096186.99985: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.00006: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096187.00119: variable 'ansible_distribution' from source: facts 13355 1727096187.00123: variable '__network_rh_distros' from source: role '' defaults 13355 1727096187.00125: variable 'ansible_distribution_major_version' from source: 
facts 13355 1727096187.00138: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13355 1727096187.00246: variable 'ansible_distribution' from source: facts 13355 1727096187.00251: variable '__network_rh_distros' from source: role '' defaults 13355 1727096187.00254: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.00287: variable 'network_provider' from source: set_fact 13355 1727096187.00298: variable 'ansible_facts' from source: unknown 13355 1727096187.00746: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13355 1727096187.00750: when evaluation is False, skipping this task 13355 1727096187.00753: _execute() done 13355 1727096187.00755: dumping result to json 13355 1727096187.00757: done dumping result, returning 13355 1727096187.00766: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-c514-593f-0000000000d7] 13355 1727096187.00771: sending task result for task 0afff68d-5257-c514-593f-0000000000d7 13355 1727096187.00862: done sending task result for task 0afff68d-5257-c514-593f-0000000000d7 13355 1727096187.00865: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13355 1727096187.00948: no more pending results, returning what we have 13355 1727096187.00951: results queue empty 13355 1727096187.00952: checking for any_errors_fatal 13355 1727096187.00958: done checking for any_errors_fatal 13355 1727096187.00959: checking for max_fail_percentage 13355 1727096187.00960: done checking for max_fail_percentage 13355 1727096187.00961: checking to see if all hosts have failed and the running result is not ok 13355 1727096187.00962: done checking to see if all hosts have failed 13355 1727096187.00963: getting the remaining hosts for 
this loop 13355 1727096187.00964: done getting the remaining hosts for this loop 13355 1727096187.00970: getting the next task for host managed_node3 13355 1727096187.00977: done getting next task for host managed_node3 13355 1727096187.00982: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13355 1727096187.00985: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096187.01007: getting variables 13355 1727096187.01009: in VariableManager get_vars() 13355 1727096187.01057: Calling all_inventory to load vars for managed_node3 13355 1727096187.01059: Calling groups_inventory to load vars for managed_node3 13355 1727096187.01061: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096187.01078: Calling all_plugins_play to load vars for managed_node3 13355 1727096187.01081: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096187.01084: Calling groups_plugins_play to load vars for managed_node3 13355 1727096187.01902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096187.02801: done with get_vars() 13355 1727096187.02823: done getting variables 13355 1727096187.02872: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:56:27 -0400 (0:00:00.138) 0:00:36.289 ****** 13355 1727096187.02897: entering _queue_task() for managed_node3/package 13355 1727096187.03163: worker is 1 (out of 1 available) 13355 1727096187.03177: exiting _queue_task() for managed_node3/package 13355 1727096187.03190: done queuing things up, now waiting for results queue to drain 13355 1727096187.03192: waiting for pending results... 13355 1727096187.03377: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13355 1727096187.03474: in run() - task 0afff68d-5257-c514-593f-0000000000d8 13355 1727096187.03492: variable 'ansible_search_path' from source: unknown 13355 1727096187.03495: variable 'ansible_search_path' from source: unknown 13355 1727096187.03525: calling self._execute() 13355 1727096187.03601: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.03605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.03612: variable 'omit' from source: magic vars 13355 1727096187.03901: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.03911: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096187.03993: variable 'network_state' from source: role '' defaults 13355 1727096187.04002: Evaluated conditional (network_state != {}): False 13355 1727096187.04005: when evaluation is False, skipping this task 13355 1727096187.04007: _execute() done 13355 
1727096187.04010: dumping result to json 13355 1727096187.04012: done dumping result, returning 13355 1727096187.04020: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-c514-593f-0000000000d8] 13355 1727096187.04025: sending task result for task 0afff68d-5257-c514-593f-0000000000d8 13355 1727096187.04121: done sending task result for task 0afff68d-5257-c514-593f-0000000000d8 13355 1727096187.04124: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096187.04178: no more pending results, returning what we have 13355 1727096187.04183: results queue empty 13355 1727096187.04183: checking for any_errors_fatal 13355 1727096187.04189: done checking for any_errors_fatal 13355 1727096187.04189: checking for max_fail_percentage 13355 1727096187.04191: done checking for max_fail_percentage 13355 1727096187.04192: checking to see if all hosts have failed and the running result is not ok 13355 1727096187.04193: done checking to see if all hosts have failed 13355 1727096187.04194: getting the remaining hosts for this loop 13355 1727096187.04195: done getting the remaining hosts for this loop 13355 1727096187.04198: getting the next task for host managed_node3 13355 1727096187.04205: done getting next task for host managed_node3 13355 1727096187.04208: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13355 1727096187.04212: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096187.04237: getting variables 13355 1727096187.04239: in VariableManager get_vars() 13355 1727096187.04290: Calling all_inventory to load vars for managed_node3 13355 1727096187.04292: Calling groups_inventory to load vars for managed_node3 13355 1727096187.04294: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096187.04303: Calling all_plugins_play to load vars for managed_node3 13355 1727096187.04306: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096187.04308: Calling groups_plugins_play to load vars for managed_node3 13355 1727096187.05224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096187.06094: done with get_vars() 13355 1727096187.06117: done getting variables 13355 1727096187.06163: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:56:27 -0400 (0:00:00.032) 0:00:36.322 ****** 13355 1727096187.06189: entering _queue_task() for managed_node3/package 13355 1727096187.06463: worker is 1 (out of 1 available) 13355 1727096187.06478: exiting _queue_task() 
for managed_node3/package 13355 1727096187.06490: done queuing things up, now waiting for results queue to drain 13355 1727096187.06492: waiting for pending results... 13355 1727096187.06675: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13355 1727096187.06776: in run() - task 0afff68d-5257-c514-593f-0000000000d9 13355 1727096187.06788: variable 'ansible_search_path' from source: unknown 13355 1727096187.06792: variable 'ansible_search_path' from source: unknown 13355 1727096187.06822: calling self._execute() 13355 1727096187.06898: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.06903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.06910: variable 'omit' from source: magic vars 13355 1727096187.07185: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.07195: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096187.07278: variable 'network_state' from source: role '' defaults 13355 1727096187.07286: Evaluated conditional (network_state != {}): False 13355 1727096187.07289: when evaluation is False, skipping this task 13355 1727096187.07292: _execute() done 13355 1727096187.07295: dumping result to json 13355 1727096187.07297: done dumping result, returning 13355 1727096187.07306: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-c514-593f-0000000000d9] 13355 1727096187.07310: sending task result for task 0afff68d-5257-c514-593f-0000000000d9 13355 1727096187.07409: done sending task result for task 0afff68d-5257-c514-593f-0000000000d9 13355 1727096187.07412: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" 
} 13355 1727096187.07460: no more pending results, returning what we have 13355 1727096187.07464: results queue empty 13355 1727096187.07464: checking for any_errors_fatal 13355 1727096187.07475: done checking for any_errors_fatal 13355 1727096187.07475: checking for max_fail_percentage 13355 1727096187.07479: done checking for max_fail_percentage 13355 1727096187.07480: checking to see if all hosts have failed and the running result is not ok 13355 1727096187.07481: done checking to see if all hosts have failed 13355 1727096187.07482: getting the remaining hosts for this loop 13355 1727096187.07483: done getting the remaining hosts for this loop 13355 1727096187.07486: getting the next task for host managed_node3 13355 1727096187.07492: done getting next task for host managed_node3 13355 1727096187.07496: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13355 1727096187.07499: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096187.07522: getting variables 13355 1727096187.07523: in VariableManager get_vars() 13355 1727096187.07577: Calling all_inventory to load vars for managed_node3 13355 1727096187.07580: Calling groups_inventory to load vars for managed_node3 13355 1727096187.07582: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096187.07592: Calling all_plugins_play to load vars for managed_node3 13355 1727096187.07594: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096187.07597: Calling groups_plugins_play to load vars for managed_node3 13355 1727096187.08380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096187.09368: done with get_vars() 13355 1727096187.09386: done getting variables 13355 1727096187.09431: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:56:27 -0400 (0:00:00.032) 0:00:36.355 ****** 13355 1727096187.09460: entering _queue_task() for managed_node3/service 13355 1727096187.09713: worker is 1 (out of 1 available) 13355 1727096187.09726: exiting _queue_task() for managed_node3/service 13355 1727096187.09739: done queuing things up, now waiting for results queue to drain 13355 1727096187.09741: waiting for pending results... 
13355 1727096187.09918: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13355 1727096187.10011: in run() - task 0afff68d-5257-c514-593f-0000000000da 13355 1727096187.10023: variable 'ansible_search_path' from source: unknown 13355 1727096187.10026: variable 'ansible_search_path' from source: unknown 13355 1727096187.10054: calling self._execute() 13355 1727096187.10130: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.10134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.10143: variable 'omit' from source: magic vars 13355 1727096187.10416: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.10426: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096187.10507: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096187.10645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096187.12136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096187.12186: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096187.12214: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096187.12239: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096187.12261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096187.12328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13355 1727096187.12361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.12388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.12414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.12425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.12459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.12485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.12538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.12542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.12553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.12588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.12606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.12622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.12646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.12656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.12778: variable 'network_connections' from source: task vars 13355 1727096187.12789: variable 'controller_profile' from source: play vars 13355 1727096187.12841: variable 'controller_profile' from source: play vars 13355 1727096187.12849: variable 'controller_device' from source: play vars 13355 1727096187.12894: variable 'controller_device' from source: play vars 13355 1727096187.12903: variable 'port1_profile' from source: play vars 13355 1727096187.12946: variable 'port1_profile' from source: play vars 13355 1727096187.12952: variable 'dhcp_interface1' from source: play vars 13355 1727096187.12997: variable 'dhcp_interface1' from source: play vars 13355 1727096187.13003: variable 'controller_profile' from source: play vars 13355 
1727096187.13048: variable 'controller_profile' from source: play vars 13355 1727096187.13053: variable 'port2_profile' from source: play vars 13355 1727096187.13100: variable 'port2_profile' from source: play vars 13355 1727096187.13106: variable 'dhcp_interface2' from source: play vars 13355 1727096187.13149: variable 'dhcp_interface2' from source: play vars 13355 1727096187.13155: variable 'controller_profile' from source: play vars 13355 1727096187.13200: variable 'controller_profile' from source: play vars 13355 1727096187.13252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096187.13363: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096187.13393: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096187.13416: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096187.13437: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096187.13475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096187.13491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096187.13509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.13526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13355 1727096187.13582: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096187.13743: variable 'network_connections' from source: task vars 13355 1727096187.13747: variable 'controller_profile' from source: play vars 13355 1727096187.13795: variable 'controller_profile' from source: play vars 13355 1727096187.13803: variable 'controller_device' from source: play vars 13355 1727096187.13846: variable 'controller_device' from source: play vars 13355 1727096187.13853: variable 'port1_profile' from source: play vars 13355 1727096187.13903: variable 'port1_profile' from source: play vars 13355 1727096187.13906: variable 'dhcp_interface1' from source: play vars 13355 1727096187.13946: variable 'dhcp_interface1' from source: play vars 13355 1727096187.13952: variable 'controller_profile' from source: play vars 13355 1727096187.13996: variable 'controller_profile' from source: play vars 13355 1727096187.14002: variable 'port2_profile' from source: play vars 13355 1727096187.14045: variable 'port2_profile' from source: play vars 13355 1727096187.14056: variable 'dhcp_interface2' from source: play vars 13355 1727096187.14272: variable 'dhcp_interface2' from source: play vars 13355 1727096187.14276: variable 'controller_profile' from source: play vars 13355 1727096187.14278: variable 'controller_profile' from source: play vars 13355 1727096187.14280: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096187.14282: when evaluation is False, skipping this task 13355 1727096187.14284: _execute() done 13355 1727096187.14286: dumping result to json 13355 1727096187.14289: done dumping result, returning 13355 1727096187.14291: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-0000000000da] 13355 1727096187.14293: sending task result for task 
0afff68d-5257-c514-593f-0000000000da 13355 1727096187.14374: done sending task result for task 0afff68d-5257-c514-593f-0000000000da 13355 1727096187.14378: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096187.14425: no more pending results, returning what we have 13355 1727096187.14428: results queue empty 13355 1727096187.14429: checking for any_errors_fatal 13355 1727096187.14436: done checking for any_errors_fatal 13355 1727096187.14437: checking for max_fail_percentage 13355 1727096187.14439: done checking for max_fail_percentage 13355 1727096187.14440: checking to see if all hosts have failed and the running result is not ok 13355 1727096187.14440: done checking to see if all hosts have failed 13355 1727096187.14441: getting the remaining hosts for this loop 13355 1727096187.14442: done getting the remaining hosts for this loop 13355 1727096187.14446: getting the next task for host managed_node3 13355 1727096187.14452: done getting next task for host managed_node3 13355 1727096187.14456: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096187.14458: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096187.14505: getting variables 13355 1727096187.14507: in VariableManager get_vars() 13355 1727096187.14560: Calling all_inventory to load vars for managed_node3 13355 1727096187.14563: Calling groups_inventory to load vars for managed_node3 13355 1727096187.14565: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096187.14782: Calling all_plugins_play to load vars for managed_node3 13355 1727096187.14786: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096187.14789: Calling groups_plugins_play to load vars for managed_node3 13355 1727096187.15810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096187.16686: done with get_vars() 13355 1727096187.16707: done getting variables 13355 1727096187.16757: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:56:27 -0400 (0:00:00.073) 0:00:36.428 ****** 13355 1727096187.16784: entering _queue_task() for managed_node3/service 13355 1727096187.17138: worker is 1 (out of 1 available) 13355 1727096187.17152: exiting _queue_task() for managed_node3/service 13355 1727096187.17413: done queuing things up, now waiting for results queue to drain 13355 1727096187.17415: waiting for pending results... 
13355 1727096187.17521: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096187.17664: in run() - task 0afff68d-5257-c514-593f-0000000000db 13355 1727096187.17687: variable 'ansible_search_path' from source: unknown 13355 1727096187.17695: variable 'ansible_search_path' from source: unknown 13355 1727096187.17733: calling self._execute() 13355 1727096187.17840: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.17872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.17876: variable 'omit' from source: magic vars 13355 1727096187.18264: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.18290: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096187.18508: variable 'network_provider' from source: set_fact 13355 1727096187.18511: variable 'network_state' from source: role '' defaults 13355 1727096187.18514: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13355 1727096187.18516: variable 'omit' from source: magic vars 13355 1727096187.18552: variable 'omit' from source: magic vars 13355 1727096187.18590: variable 'network_service_name' from source: role '' defaults 13355 1727096187.18660: variable 'network_service_name' from source: role '' defaults 13355 1727096187.18774: variable '__network_provider_setup' from source: role '' defaults 13355 1727096187.18786: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096187.18852: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096187.18945: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096187.18948: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096187.19185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13355 1727096187.21448: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096187.21518: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096187.21573: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096187.21626: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096187.21665: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096187.21749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.21793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.21825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.21877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.22072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.22076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096187.22078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.22080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.22082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.22084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.22260: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096187.22387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.22417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.22444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.22488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.22504: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.22599: variable 'ansible_python' from source: facts 13355 1727096187.22628: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096187.22719: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096187.22807: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096187.22964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.22970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.22999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.23040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.23060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.23172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.23189: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.23192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.23221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.23239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.23384: variable 'network_connections' from source: task vars 13355 1727096187.23396: variable 'controller_profile' from source: play vars 13355 1727096187.23481: variable 'controller_profile' from source: play vars 13355 1727096187.23496: variable 'controller_device' from source: play vars 13355 1727096187.23577: variable 'controller_device' from source: play vars 13355 1727096187.23623: variable 'port1_profile' from source: play vars 13355 1727096187.23676: variable 'port1_profile' from source: play vars 13355 1727096187.23693: variable 'dhcp_interface1' from source: play vars 13355 1727096187.23774: variable 'dhcp_interface1' from source: play vars 13355 1727096187.23789: variable 'controller_profile' from source: play vars 13355 1727096187.23950: variable 'controller_profile' from source: play vars 13355 1727096187.23953: variable 'port2_profile' from source: play vars 13355 1727096187.23975: variable 'port2_profile' from source: play vars 13355 1727096187.23991: variable 'dhcp_interface2' from source: play vars 13355 1727096187.24071: variable 'dhcp_interface2' from source: play vars 13355 
1727096187.24086: variable 'controller_profile' from source: play vars 13355 1727096187.24155: variable 'controller_profile' from source: play vars 13355 1727096187.24283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096187.24480: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096187.24539: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096187.24951: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096187.25001: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096187.25079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096187.25112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096187.25151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.25258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096187.25262: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096187.25547: variable 'network_connections' from source: task vars 13355 1727096187.25562: variable 'controller_profile' from source: play vars 13355 1727096187.25644: variable 'controller_profile' from source: play vars 13355 
1727096187.25662: variable 'controller_device' from source: play vars 13355 1727096187.25742: variable 'controller_device' from source: play vars 13355 1727096187.25763: variable 'port1_profile' from source: play vars 13355 1727096187.25841: variable 'port1_profile' from source: play vars 13355 1727096187.25861: variable 'dhcp_interface1' from source: play vars 13355 1727096187.25940: variable 'dhcp_interface1' from source: play vars 13355 1727096187.26019: variable 'controller_profile' from source: play vars 13355 1727096187.26035: variable 'controller_profile' from source: play vars 13355 1727096187.26049: variable 'port2_profile' from source: play vars 13355 1727096187.26129: variable 'port2_profile' from source: play vars 13355 1727096187.26144: variable 'dhcp_interface2' from source: play vars 13355 1727096187.26221: variable 'dhcp_interface2' from source: play vars 13355 1727096187.26241: variable 'controller_profile' from source: play vars 13355 1727096187.26317: variable 'controller_profile' from source: play vars 13355 1727096187.26380: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096187.26467: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096187.26771: variable 'network_connections' from source: task vars 13355 1727096187.26785: variable 'controller_profile' from source: play vars 13355 1727096187.26854: variable 'controller_profile' from source: play vars 13355 1727096187.26892: variable 'controller_device' from source: play vars 13355 1727096187.26949: variable 'controller_device' from source: play vars 13355 1727096187.27000: variable 'port1_profile' from source: play vars 13355 1727096187.27042: variable 'port1_profile' from source: play vars 13355 1727096187.27053: variable 'dhcp_interface1' from source: play vars 13355 1727096187.27130: variable 'dhcp_interface1' from source: play vars 13355 1727096187.27140: variable 'controller_profile' from source: play vars 
13355 1727096187.27219: variable 'controller_profile' from source: play vars 13355 1727096187.27231: variable 'port2_profile' from source: play vars 13355 1727096187.27327: variable 'port2_profile' from source: play vars 13355 1727096187.27330: variable 'dhcp_interface2' from source: play vars 13355 1727096187.27391: variable 'dhcp_interface2' from source: play vars 13355 1727096187.27403: variable 'controller_profile' from source: play vars 13355 1727096187.27481: variable 'controller_profile' from source: play vars 13355 1727096187.27545: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096187.27599: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096187.27903: variable 'network_connections' from source: task vars 13355 1727096187.27913: variable 'controller_profile' from source: play vars 13355 1727096187.27990: variable 'controller_profile' from source: play vars 13355 1727096187.28001: variable 'controller_device' from source: play vars 13355 1727096187.28090: variable 'controller_device' from source: play vars 13355 1727096187.28093: variable 'port1_profile' from source: play vars 13355 1727096187.28159: variable 'port1_profile' from source: play vars 13355 1727096187.28199: variable 'dhcp_interface1' from source: play vars 13355 1727096187.28250: variable 'dhcp_interface1' from source: play vars 13355 1727096187.28264: variable 'controller_profile' from source: play vars 13355 1727096187.28339: variable 'controller_profile' from source: play vars 13355 1727096187.28574: variable 'port2_profile' from source: play vars 13355 1727096187.28577: variable 'port2_profile' from source: play vars 13355 1727096187.28579: variable 'dhcp_interface2' from source: play vars 13355 1727096187.28581: variable 'dhcp_interface2' from source: play vars 13355 1727096187.28583: variable 'controller_profile' from source: play vars 13355 1727096187.28585: variable 'controller_profile' from source: play vars 
13355 1727096187.28647: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096187.28724: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096187.28735: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096187.28799: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096187.29006: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096187.29714: variable 'network_connections' from source: task vars 13355 1727096187.29872: variable 'controller_profile' from source: play vars 13355 1727096187.29875: variable 'controller_profile' from source: play vars 13355 1727096187.29878: variable 'controller_device' from source: play vars 13355 1727096187.30060: variable 'controller_device' from source: play vars 13355 1727096187.30079: variable 'port1_profile' from source: play vars 13355 1727096187.30374: variable 'port1_profile' from source: play vars 13355 1727096187.30377: variable 'dhcp_interface1' from source: play vars 13355 1727096187.30379: variable 'dhcp_interface1' from source: play vars 13355 1727096187.30399: variable 'controller_profile' from source: play vars 13355 1727096187.30462: variable 'controller_profile' from source: play vars 13355 1727096187.30525: variable 'port2_profile' from source: play vars 13355 1727096187.30703: variable 'port2_profile' from source: play vars 13355 1727096187.30715: variable 'dhcp_interface2' from source: play vars 13355 1727096187.30815: variable 'dhcp_interface2' from source: play vars 13355 1727096187.30825: variable 'controller_profile' from source: play vars 13355 1727096187.30890: variable 'controller_profile' from source: play vars 13355 1727096187.31036: variable 'ansible_distribution' from source: facts 13355 1727096187.31045: variable '__network_rh_distros' from source: role '' defaults 13355 1727096187.31054: 
variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.31089: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096187.31422: variable 'ansible_distribution' from source: facts 13355 1727096187.31459: variable '__network_rh_distros' from source: role '' defaults 13355 1727096187.31494: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.31510: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13355 1727096187.32033: variable 'ansible_distribution' from source: facts 13355 1727096187.32036: variable '__network_rh_distros' from source: role '' defaults 13355 1727096187.32038: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.32040: variable 'network_provider' from source: set_fact 13355 1727096187.32101: variable 'omit' from source: magic vars 13355 1727096187.32150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096187.32343: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096187.32346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096187.32348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096187.32350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096187.32352: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096187.32435: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.32448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.32706: Set connection var ansible_shell_executable to /bin/sh 13355 
1727096187.32718: Set connection var ansible_shell_type to sh 13355 1727096187.32727: Set connection var ansible_pipelining to False 13355 1727096187.32735: Set connection var ansible_connection to ssh 13355 1727096187.32782: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096187.32791: Set connection var ansible_timeout to 10 13355 1727096187.32822: variable 'ansible_shell_executable' from source: unknown 13355 1727096187.32990: variable 'ansible_connection' from source: unknown 13355 1727096187.32993: variable 'ansible_module_compression' from source: unknown 13355 1727096187.32995: variable 'ansible_shell_type' from source: unknown 13355 1727096187.32997: variable 'ansible_shell_executable' from source: unknown 13355 1727096187.32999: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.33001: variable 'ansible_pipelining' from source: unknown 13355 1727096187.33003: variable 'ansible_timeout' from source: unknown 13355 1727096187.33005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.33147: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096187.33220: variable 'omit' from source: magic vars 13355 1727096187.33229: starting attempt loop 13355 1727096187.33235: running the handler 13355 1727096187.33427: variable 'ansible_facts' from source: unknown 13355 1727096187.34224: _low_level_execute_command(): starting 13355 1727096187.34237: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096187.35074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096187.35119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096187.35137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096187.35175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096187.35238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096187.36910: stdout chunk (state=3): >>>/root <<< 13355 1727096187.37326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096187.37329: stdout chunk (state=3): >>><<< 13355 1727096187.37331: stderr chunk (state=3): >>><<< 13355 1727096187.37334: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096187.37340: _low_level_execute_command(): starting 13355 1727096187.37343: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554 `" && echo ansible-tmp-1727096187.3730288-14920-52331231670554="` echo /root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554 `" ) && sleep 0' 13355 1727096187.38417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096187.38421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096187.38437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096187.38442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096187.38462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 
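The `umask 77 && mkdir -p … && mkdir … && echo …` one-liner issued above is how Ansible provisions its remote working directory: a restrictive umask, a timestamped directory name, and the resolved path echoed back so the controller can parse it from stdout. A rough local equivalent in Python, for illustration only (the base path and naming here are illustrative, not the exact values from this log):

```python
import os
import time

def make_ansible_style_tmpdir(base="/tmp/ansible-demo"):
    """Mimic Ansible's remote tmp-dir step: apply a restrictive umask
    (077, so only the connecting user can access the directory), create
    a uniquely named directory, and return its path the way the remote
    shell echoes it back for the controller to parse."""
    name = f"ansible-tmp-{time.time():.7f}-{os.getpid()}"
    old_umask = os.umask(0o77)
    try:
        os.makedirs(base, exist_ok=True)   # like `mkdir -p` on the base
        path = os.path.join(base, name)
        os.mkdir(path)                     # bare mkdir: fails if it exists
    finally:
        os.umask(old_umask)
    return path

print(make_ansible_style_tmpdir())
```

The bare `mkdir` (no `-p`) for the final component matters: if the uniquely named directory already exists, something is wrong and the step should fail rather than silently reuse it.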
13355 1727096187.38465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096187.38490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096187.38493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096187.38558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096187.38576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096187.38579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096187.38643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096187.40937: stdout chunk (state=3): >>>ansible-tmp-1727096187.3730288-14920-52331231670554=/root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554 <<< 13355 1727096187.40941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096187.41375: stderr chunk (state=3): >>><<< 13355 1727096187.41379: stdout chunk (state=3): >>><<< 13355 1727096187.41381: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096187.3730288-14920-52331231670554=/root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096187.41383: variable 'ansible_module_compression' from source: unknown 13355 1727096187.41385: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13355 1727096187.41387: variable 'ansible_facts' from source: unknown 13355 1727096187.41512: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/AnsiballZ_systemd.py 13355 1727096187.41801: Sending initial data 13355 1727096187.41811: Sent initial data (155 bytes) 13355 1727096187.42463: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096187.42481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096187.42491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096187.42531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096187.42543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096187.42585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096187.44432: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096187.44466: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096187.44530: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/AnsiballZ_systemd.py" debug1: stat remote: No such file or directory <<< 13355 1727096187.44534: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmptf_92qph /root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/AnsiballZ_systemd.py <<< 13355 1727096187.44542: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmptf_92qph" to remote "/root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/AnsiballZ_systemd.py" <<< 13355 1727096187.46216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096187.46261: stderr chunk (state=3): >>><<< 13355 1727096187.46265: stdout chunk (state=3): >>><<< 13355 1727096187.46275: done transferring module to remote 13355 1727096187.46285: _low_level_execute_command(): starting 13355 1727096187.46289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/ /root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/AnsiballZ_systemd.py && sleep 0' 13355 1727096187.46749: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096187.46752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096187.46754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096187.46757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096187.46759: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096187.46810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096187.46813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096187.46829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096187.46859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096187.48726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096187.48730: stderr chunk (state=3): >>><<< 13355 1727096187.48732: stdout chunk (state=3): >>><<< 13355 1727096187.48777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096187.48784: _low_level_execute_command(): starting 13355 1727096187.48786: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/AnsiballZ_systemd.py && sleep 0' 13355 1727096187.49220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096187.49224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096187.49226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096187.49228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096187.49230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096187.49286: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096187.49291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096187.49293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096187.49331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096187.79251: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": 
"{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10489856", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3320127488", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1101529000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", 
"MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 13355 1727096187.79290: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "system<<< 13355 1727096187.79304: stdout chunk (state=3): >>>d-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13355 1727096187.81347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
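The JSON payload printed above is what the systemd module writes to stdout and what the `service` action plugin consumes back on the controller. A small sketch of extracting the commonly checked fields from such a result; the dict literal is abridged from the log and `summarize_service` is a hypothetical helper, not part of Ansible:

```python
import json

# Abridged copy of the module-result structure seen in the log above.
raw = json.dumps({
    "name": "NetworkManager",
    "changed": False,
    "enabled": True,
    "state": "started",
    "status": {
        "ActiveState": "active",
        "SubState": "running",
        "UnitFileState": "enabled",
        "MainPID": "705",
    },
})

def summarize_service(payload: str) -> dict:
    """Reduce a systemd module result to the fields a status check
    usually cares about. Note that systemd property values arrive as
    strings (e.g. MainPID is "705", not 705), so convert as needed."""
    result = json.loads(payload)
    status = result.get("status", {})
    return {
        "name": result["name"],
        "changed": result["changed"],
        "active": status.get("ActiveState") == "active",
        "pid": int(status.get("MainPID", "0")),
    }

print(summarize_service(raw))
```

`"changed": false` in the log is the expected outcome here: the unit was already active and enabled, so the task converged without touching the service.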
<<< 13355 1727096187.81381: stderr chunk (state=3): >>><<< 13355 1727096187.81384: stdout chunk (state=3): >>><<< 13355 1727096187.81401: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10489856", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3320127488", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1101529000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
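Editor's note: the JSON result above is the output of the `ansible.legacy.systemd` module for the task `fedora.linux_system_roles.network : Enable and start NetworkManager`. A hedged sketch of a task consistent with the `module_args` logged above (`name=NetworkManager`, `state=started`, `enabled=true`) — the role's actual task file lives at `roles/network/tasks/main.yml` in the collection and may differ:

```yaml
# Sketch only, reconstructed from the logged module_args; not the
# role's verbatim task definition.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
```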
13355 1727096187.81515: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096187.81531: _low_level_execute_command(): starting 13355 1727096187.81536: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096187.3730288-14920-52331231670554/ > /dev/null 2>&1 && sleep 0' 13355 1727096187.82070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096187.82075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096187.82090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096187.82104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096187.82195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096187.84063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096187.84097: stderr chunk (state=3): >>><<< 13355 1727096187.84101: stdout chunk (state=3): >>><<< 13355 1727096187.84113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096187.84120: handler run complete 13355 
1727096187.84168: attempt loop complete, returning result 13355 1727096187.84172: _execute() done 13355 1727096187.84174: dumping result to json 13355 1727096187.84185: done dumping result, returning 13355 1727096187.84195: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-c514-593f-0000000000db] 13355 1727096187.84199: sending task result for task 0afff68d-5257-c514-593f-0000000000db 13355 1727096187.84397: done sending task result for task 0afff68d-5257-c514-593f-0000000000db 13355 1727096187.84400: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096187.84456: no more pending results, returning what we have 13355 1727096187.84460: results queue empty 13355 1727096187.84461: checking for any_errors_fatal 13355 1727096187.84467: done checking for any_errors_fatal 13355 1727096187.84469: checking for max_fail_percentage 13355 1727096187.84471: done checking for max_fail_percentage 13355 1727096187.84472: checking to see if all hosts have failed and the running result is not ok 13355 1727096187.84472: done checking to see if all hosts have failed 13355 1727096187.84473: getting the remaining hosts for this loop 13355 1727096187.84474: done getting the remaining hosts for this loop 13355 1727096187.84477: getting the next task for host managed_node3 13355 1727096187.84484: done getting next task for host managed_node3 13355 1727096187.84487: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096187.84490: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096187.84501: getting variables 13355 1727096187.84503: in VariableManager get_vars() 13355 1727096187.84587: Calling all_inventory to load vars for managed_node3 13355 1727096187.84590: Calling groups_inventory to load vars for managed_node3 13355 1727096187.84592: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096187.84600: Calling all_plugins_play to load vars for managed_node3 13355 1727096187.84603: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096187.84605: Calling groups_plugins_play to load vars for managed_node3 13355 1727096187.85691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096187.86860: done with get_vars() 13355 1727096187.86891: done getting variables 13355 1727096187.86938: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:56:27 -0400 (0:00:00.701) 0:00:37.130 ****** 13355 1727096187.86963: entering _queue_task() for managed_node3/service 13355 1727096187.87234: worker is 1 (out of 1 available) 13355 1727096187.87248: exiting _queue_task() for managed_node3/service 
13355 1727096187.87261: done queuing things up, now waiting for results queue to drain 13355 1727096187.87262: waiting for pending results... 13355 1727096187.87449: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096187.87550: in run() - task 0afff68d-5257-c514-593f-0000000000dc 13355 1727096187.87564: variable 'ansible_search_path' from source: unknown 13355 1727096187.87569: variable 'ansible_search_path' from source: unknown 13355 1727096187.87605: calling self._execute() 13355 1727096187.87677: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.87683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.87692: variable 'omit' from source: magic vars 13355 1727096187.87984: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.87993: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096187.88078: variable 'network_provider' from source: set_fact 13355 1727096187.88082: Evaluated conditional (network_provider == "nm"): True 13355 1727096187.88153: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096187.88216: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096187.88337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096187.89831: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096187.89884: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096187.89916: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096187.89941: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096187.89963: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096187.90039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.90062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.90091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.90113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.90124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.90157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.90178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.90199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 13355 1727096187.90223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.90234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.90265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096187.90283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096187.90302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.90327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096187.90337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096187.90451: variable 'network_connections' from source: task vars 13355 1727096187.90465: variable 'controller_profile' from source: play vars 13355 1727096187.90520: variable 'controller_profile' from source: play vars 13355 1727096187.90530: variable 'controller_device' from source: play vars 13355 1727096187.90576: variable 'controller_device' from source: play vars 13355 1727096187.90585: 
variable 'port1_profile' from source: play vars 13355 1727096187.90627: variable 'port1_profile' from source: play vars 13355 1727096187.90637: variable 'dhcp_interface1' from source: play vars 13355 1727096187.90684: variable 'dhcp_interface1' from source: play vars 13355 1727096187.90691: variable 'controller_profile' from source: play vars 13355 1727096187.90732: variable 'controller_profile' from source: play vars 13355 1727096187.90746: variable 'port2_profile' from source: play vars 13355 1727096187.90789: variable 'port2_profile' from source: play vars 13355 1727096187.90795: variable 'dhcp_interface2' from source: play vars 13355 1727096187.90838: variable 'dhcp_interface2' from source: play vars 13355 1727096187.90850: variable 'controller_profile' from source: play vars 13355 1727096187.90894: variable 'controller_profile' from source: play vars 13355 1727096187.90944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096187.91071: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096187.91104: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096187.91126: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096187.91148: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096187.91188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096187.91205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096187.91222: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096187.91244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096187.91290: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096187.91459: variable 'network_connections' from source: task vars 13355 1727096187.91463: variable 'controller_profile' from source: play vars 13355 1727096187.91511: variable 'controller_profile' from source: play vars 13355 1727096187.91514: variable 'controller_device' from source: play vars 13355 1727096187.91559: variable 'controller_device' from source: play vars 13355 1727096187.91565: variable 'port1_profile' from source: play vars 13355 1727096187.91612: variable 'port1_profile' from source: play vars 13355 1727096187.91615: variable 'dhcp_interface1' from source: play vars 13355 1727096187.91658: variable 'dhcp_interface1' from source: play vars 13355 1727096187.91661: variable 'controller_profile' from source: play vars 13355 1727096187.91703: variable 'controller_profile' from source: play vars 13355 1727096187.91709: variable 'port2_profile' from source: play vars 13355 1727096187.91753: variable 'port2_profile' from source: play vars 13355 1727096187.91760: variable 'dhcp_interface2' from source: play vars 13355 1727096187.91801: variable 'dhcp_interface2' from source: play vars 13355 1727096187.91807: variable 'controller_profile' from source: play vars 13355 1727096187.91853: variable 'controller_profile' from source: play vars 13355 1727096187.91888: Evaluated conditional (__network_wpa_supplicant_required): False 13355 1727096187.91891: when evaluation is False, skipping this task 13355 1727096187.91894: _execute() done 13355 1727096187.91896: 
dumping result to json 13355 1727096187.91898: done dumping result, returning 13355 1727096187.91906: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-c514-593f-0000000000dc] 13355 1727096187.91911: sending task result for task 0afff68d-5257-c514-593f-0000000000dc 13355 1727096187.92008: done sending task result for task 0afff68d-5257-c514-593f-0000000000dc 13355 1727096187.92011: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13355 1727096187.92095: no more pending results, returning what we have 13355 1727096187.92099: results queue empty 13355 1727096187.92100: checking for any_errors_fatal 13355 1727096187.92117: done checking for any_errors_fatal 13355 1727096187.92118: checking for max_fail_percentage 13355 1727096187.92120: done checking for max_fail_percentage 13355 1727096187.92121: checking to see if all hosts have failed and the running result is not ok 13355 1727096187.92121: done checking to see if all hosts have failed 13355 1727096187.92122: getting the remaining hosts for this loop 13355 1727096187.92123: done getting the remaining hosts for this loop 13355 1727096187.92127: getting the next task for host managed_node3 13355 1727096187.92133: done getting next task for host managed_node3 13355 1727096187.92136: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096187.92139: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096187.92165: getting variables 13355 1727096187.92169: in VariableManager get_vars() 13355 1727096187.92221: Calling all_inventory to load vars for managed_node3 13355 1727096187.92223: Calling groups_inventory to load vars for managed_node3 13355 1727096187.92225: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096187.92235: Calling all_plugins_play to load vars for managed_node3 13355 1727096187.92237: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096187.92240: Calling groups_plugins_play to load vars for managed_node3 13355 1727096187.93065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096187.93938: done with get_vars() 13355 1727096187.93965: done getting variables 13355 1727096187.94014: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:56:27 -0400 (0:00:00.070) 0:00:37.201 ****** 13355 1727096187.94041: entering _queue_task() for managed_node3/service 13355 1727096187.94320: worker is 1 (out of 1 available) 13355 1727096187.94334: exiting _queue_task() for managed_node3/service 13355 1727096187.94347: done queuing things up, now waiting for results queue to drain 13355 1727096187.94348: waiting for pending results... 
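Editor's note: the trace above shows the `Enable and start wpa_supplicant` task being skipped because `__network_wpa_supplicant_required` evaluated to `False` (after `network_provider == "nm"` evaluated `True`). A hedged sketch of that skip pattern — the variable names come from the log, but the exact task body is an assumption:

```yaml
# Sketch of the conditional-skip pattern seen in the trace; the real
# task in the fedora.linux_system_roles.network role may differ.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required
```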
13355 1727096187.94534: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096187.94626: in run() - task 0afff68d-5257-c514-593f-0000000000dd 13355 1727096187.94638: variable 'ansible_search_path' from source: unknown 13355 1727096187.94642: variable 'ansible_search_path' from source: unknown 13355 1727096187.94675: calling self._execute() 13355 1727096187.94751: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.94754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.94769: variable 'omit' from source: magic vars 13355 1727096187.95057: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.95070: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096187.95151: variable 'network_provider' from source: set_fact 13355 1727096187.95155: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096187.95161: when evaluation is False, skipping this task 13355 1727096187.95164: _execute() done 13355 1727096187.95169: dumping result to json 13355 1727096187.95172: done dumping result, returning 13355 1727096187.95180: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-c514-593f-0000000000dd] 13355 1727096187.95185: sending task result for task 0afff68d-5257-c514-593f-0000000000dd 13355 1727096187.95271: done sending task result for task 0afff68d-5257-c514-593f-0000000000dd 13355 1727096187.95274: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096187.95317: no more pending results, returning what we have 13355 1727096187.95321: results queue empty 13355 1727096187.95321: checking for any_errors_fatal 13355 1727096187.95329: done checking for 
any_errors_fatal 13355 1727096187.95330: checking for max_fail_percentage 13355 1727096187.95332: done checking for max_fail_percentage 13355 1727096187.95333: checking to see if all hosts have failed and the running result is not ok 13355 1727096187.95333: done checking to see if all hosts have failed 13355 1727096187.95334: getting the remaining hosts for this loop 13355 1727096187.95335: done getting the remaining hosts for this loop 13355 1727096187.95338: getting the next task for host managed_node3 13355 1727096187.95345: done getting next task for host managed_node3 13355 1727096187.95349: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13355 1727096187.95352: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096187.95377: getting variables 13355 1727096187.95378: in VariableManager get_vars() 13355 1727096187.95429: Calling all_inventory to load vars for managed_node3 13355 1727096187.95432: Calling groups_inventory to load vars for managed_node3 13355 1727096187.95435: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096187.95444: Calling all_plugins_play to load vars for managed_node3 13355 1727096187.95447: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096187.95449: Calling groups_plugins_play to load vars for managed_node3 13355 1727096187.96378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096187.97242: done with get_vars() 13355 1727096187.97269: done getting variables 13355 1727096187.97315: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:56:27 -0400 (0:00:00.033) 0:00:37.234 ****** 13355 1727096187.97343: entering _queue_task() for managed_node3/copy 13355 1727096187.97620: worker is 1 (out of 1 available) 13355 1727096187.97633: exiting _queue_task() for managed_node3/copy 13355 1727096187.97645: done queuing things up, now waiting for results queue to drain 13355 1727096187.97646: waiting for pending results... 
13355 1727096187.97837: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13355 1727096187.97935: in run() - task 0afff68d-5257-c514-593f-0000000000de 13355 1727096187.97947: variable 'ansible_search_path' from source: unknown 13355 1727096187.97951: variable 'ansible_search_path' from source: unknown 13355 1727096187.97987: calling self._execute() 13355 1727096187.98063: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096187.98069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096187.98078: variable 'omit' from source: magic vars 13355 1727096187.98359: variable 'ansible_distribution_major_version' from source: facts 13355 1727096187.98372: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096187.98453: variable 'network_provider' from source: set_fact 13355 1727096187.98457: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096187.98463: when evaluation is False, skipping this task 13355 1727096187.98466: _execute() done 13355 1727096187.98470: dumping result to json 13355 1727096187.98475: done dumping result, returning 13355 1727096187.98484: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-c514-593f-0000000000de] 13355 1727096187.98488: sending task result for task 0afff68d-5257-c514-593f-0000000000de 13355 1727096187.98583: done sending task result for task 0afff68d-5257-c514-593f-0000000000de 13355 1727096187.98586: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096187.98633: no more pending results, returning what we have 13355 1727096187.98637: results queue empty 13355 1727096187.98638: checking for 
any_errors_fatal 13355 1727096187.98644: done checking for any_errors_fatal 13355 1727096187.98645: checking for max_fail_percentage 13355 1727096187.98647: done checking for max_fail_percentage 13355 1727096187.98648: checking to see if all hosts have failed and the running result is not ok 13355 1727096187.98648: done checking to see if all hosts have failed 13355 1727096187.98649: getting the remaining hosts for this loop 13355 1727096187.98650: done getting the remaining hosts for this loop 13355 1727096187.98653: getting the next task for host managed_node3 13355 1727096187.98660: done getting next task for host managed_node3 13355 1727096187.98664: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13355 1727096187.98667: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096187.98691: getting variables 13355 1727096187.98692: in VariableManager get_vars() 13355 1727096187.98742: Calling all_inventory to load vars for managed_node3 13355 1727096187.98744: Calling groups_inventory to load vars for managed_node3 13355 1727096187.98746: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096187.98755: Calling all_plugins_play to load vars for managed_node3 13355 1727096187.98758: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096187.98761: Calling groups_plugins_play to load vars for managed_node3 13355 1727096187.99570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096188.00549: done with get_vars() 13355 1727096188.00571: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:56:28 -0400 (0:00:00.032) 0:00:37.266 ****** 13355 1727096188.00637: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13355 1727096188.00905: worker is 1 (out of 1 available) 13355 1727096188.00919: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13355 1727096188.00932: done queuing things up, now waiting for results queue to drain 13355 1727096188.00933: waiting for pending results... 
13355 1727096188.01124: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13355 1727096188.01218: in run() - task 0afff68d-5257-c514-593f-0000000000df 13355 1727096188.01230: variable 'ansible_search_path' from source: unknown 13355 1727096188.01234: variable 'ansible_search_path' from source: unknown 13355 1727096188.01276: calling self._execute() 13355 1727096188.01340: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096188.01343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096188.01352: variable 'omit' from source: magic vars 13355 1727096188.01640: variable 'ansible_distribution_major_version' from source: facts 13355 1727096188.01651: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096188.01656: variable 'omit' from source: magic vars 13355 1727096188.01704: variable 'omit' from source: magic vars 13355 1727096188.01820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096188.08116: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096188.08171: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096188.08198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096188.08220: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096188.08240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096188.08301: variable 'network_provider' from source: set_fact 13355 1727096188.08393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096188.08412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096188.08430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096188.08455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096188.08477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096188.08527: variable 'omit' from source: magic vars 13355 1727096188.08606: variable 'omit' from source: magic vars 13355 1727096188.08675: variable 'network_connections' from source: task vars 13355 1727096188.08688: variable 'controller_profile' from source: play vars 13355 1727096188.08726: variable 'controller_profile' from source: play vars 13355 1727096188.08732: variable 'controller_device' from source: play vars 13355 1727096188.08774: variable 'controller_device' from source: play vars 13355 1727096188.08783: variable 'port1_profile' from source: play vars 13355 1727096188.08825: variable 'port1_profile' from source: play vars 13355 1727096188.08831: variable 'dhcp_interface1' from source: play vars 13355 1727096188.08873: variable 'dhcp_interface1' from source: play vars 13355 1727096188.08879: variable 'controller_profile' from source: play vars 13355 1727096188.08922: variable 'controller_profile' from source: play vars 13355 1727096188.08928: 
variable 'port2_profile' from source: play vars 13355 1727096188.08970: variable 'port2_profile' from source: play vars 13355 1727096188.08976: variable 'dhcp_interface2' from source: play vars 13355 1727096188.09019: variable 'dhcp_interface2' from source: play vars 13355 1727096188.09024: variable 'controller_profile' from source: play vars 13355 1727096188.09066: variable 'controller_profile' from source: play vars 13355 1727096188.09176: variable 'omit' from source: magic vars 13355 1727096188.09183: variable '__lsr_ansible_managed' from source: task vars 13355 1727096188.09229: variable '__lsr_ansible_managed' from source: task vars 13355 1727096188.09341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13355 1727096188.09471: Loaded config def from plugin (lookup/template) 13355 1727096188.09474: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13355 1727096188.09493: File lookup term: get_ansible_managed.j2 13355 1727096188.09496: variable 'ansible_search_path' from source: unknown 13355 1727096188.09499: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13355 1727096188.09509: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13355 1727096188.09522: variable 'ansible_search_path' from source: unknown 13355 1727096188.12707: variable 'ansible_managed' from source: unknown 13355 1727096188.12785: variable 'omit' from source: magic vars 13355 1727096188.12806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096188.12825: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096188.12837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096188.12848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096188.12858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096188.12873: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096188.12876: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096188.12880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096188.12942: Set connection var ansible_shell_executable to /bin/sh 13355 1727096188.12945: Set connection var ansible_shell_type to sh 13355 1727096188.12950: Set connection var ansible_pipelining to False 13355 1727096188.12955: Set connection var ansible_connection to ssh 13355 1727096188.12960: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096188.12972: Set connection var ansible_timeout to 10 13355 1727096188.12987: 
variable 'ansible_shell_executable' from source: unknown 13355 1727096188.12990: variable 'ansible_connection' from source: unknown 13355 1727096188.12992: variable 'ansible_module_compression' from source: unknown 13355 1727096188.12995: variable 'ansible_shell_type' from source: unknown 13355 1727096188.12997: variable 'ansible_shell_executable' from source: unknown 13355 1727096188.12999: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096188.13003: variable 'ansible_pipelining' from source: unknown 13355 1727096188.13005: variable 'ansible_timeout' from source: unknown 13355 1727096188.13010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096188.13100: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096188.13107: variable 'omit' from source: magic vars 13355 1727096188.13113: starting attempt loop 13355 1727096188.13116: running the handler 13355 1727096188.13125: _low_level_execute_command(): starting 13355 1727096188.13129: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096188.13623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096188.13627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096188.13629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096188.13631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096188.13633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096188.13691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096188.13694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096188.13696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096188.13741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096188.15409: stdout chunk (state=3): >>>/root <<< 13355 1727096188.15504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096188.15538: stderr chunk (state=3): >>><<< 13355 1727096188.15542: stdout chunk (state=3): >>><<< 13355 1727096188.15559: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096188.15573: _low_level_execute_command(): starting 13355 1727096188.15579: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849 `" && echo ansible-tmp-1727096188.1556146-14962-246396532518849="` echo /root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849 `" ) && sleep 0' 13355 1727096188.16023: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096188.16027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096188.16029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096188.16031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096188.16033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found <<< 13355 1727096188.16035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096188.16088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096188.16099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096188.16105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096188.16130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096188.18048: stdout chunk (state=3): >>>ansible-tmp-1727096188.1556146-14962-246396532518849=/root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849 <<< 13355 1727096188.18158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096188.18186: stderr chunk (state=3): >>><<< 13355 1727096188.18189: stdout chunk (state=3): >>><<< 13355 1727096188.18204: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096188.1556146-14962-246396532518849=/root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096188.18247: variable 'ansible_module_compression' from source: unknown 13355 1727096188.18281: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13355 1727096188.18305: variable 'ansible_facts' from source: unknown 13355 1727096188.18377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/AnsiballZ_network_connections.py 13355 1727096188.18474: Sending initial data 13355 1727096188.18477: Sent initial data (168 bytes) 13355 1727096188.18976: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096188.18979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096188.18981: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096188.18983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096188.18985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096188.18987: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096188.19041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096188.19046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096188.19049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096188.19079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096188.20674: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096188.20694: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096188.20748: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp_r9n6gma /root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/AnsiballZ_network_connections.py <<< 13355 1727096188.20752: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/AnsiballZ_network_connections.py" <<< 13355 1727096188.20819: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp_r9n6gma" to remote "/root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/AnsiballZ_network_connections.py" <<< 13355 1727096188.21817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096188.21877: stderr chunk (state=3): >>><<< 13355 1727096188.21880: stdout chunk (state=3): >>><<< 13355 1727096188.21882: done transferring module to remote 13355 1727096188.21899: _low_level_execute_command(): starting 13355 1727096188.21977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/ /root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/AnsiballZ_network_connections.py && sleep 0' 13355 1727096188.22433: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096188.22465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096188.22482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096188.22518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096188.22531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096188.22575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096188.24765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096188.24771: stdout chunk (state=3): >>><<< 13355 1727096188.24774: stderr chunk (state=3): >>><<< 13355 1727096188.24776: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096188.24778: _low_level_execute_command(): starting 13355 1727096188.24780: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/AnsiballZ_network_connections.py && sleep 0' 13355 1727096188.25565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096188.25584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096188.25603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096188.25628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096188.25644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096188.25657: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096188.25731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 
1727096188.25777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096188.25802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096188.25816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096188.25897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096188.79104: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", 
"bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13355 1727096188.81135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096188.81225: stderr chunk (state=3): >>>Shared connection to 10.31.14.152 closed. <<< 13355 1727096188.81319: stderr chunk (state=3): >>><<< 13355 1727096188.81323: stdout chunk (state=3): >>><<< 13355 1727096188.81766: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", 
"interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096188.81773: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096188.81780: _low_level_execute_command(): starting 13355 1727096188.81782: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096188.1556146-14962-246396532518849/ > /dev/null 2>&1 && sleep 0' 13355 1727096188.82800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096188.82819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096188.82834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096188.82852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096188.82871: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096188.82885: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096188.82980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096188.83205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096188.83240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096188.85216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096188.85220: stdout chunk (state=3): >>><<< 13355 1727096188.85223: stderr chunk (state=3): >>><<< 13355 1727096188.85325: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096188.85328: handler run complete 13355 1727096188.85637: attempt loop complete, returning result 13355 1727096188.85641: _execute() done 13355 1727096188.85643: dumping result to json 13355 1727096188.85645: done dumping result, returning 13355 1727096188.85647: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-c514-593f-0000000000df] 13355 1727096188.85649: sending task result for task 0afff68d-5257-c514-593f-0000000000df changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7 [008] #1, state:up persistent_state:present, 'bond0.0': update connection 
bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 [009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active) 13355 1727096188.85904: no more pending results, returning what we have 13355 1727096188.85908: results queue empty 13355 1727096188.85908: checking for any_errors_fatal 13355 1727096188.85914: done checking for any_errors_fatal 13355 1727096188.85915: checking for max_fail_percentage 13355 1727096188.85916: done checking for max_fail_percentage 13355 1727096188.85917: checking to see if all hosts have failed and the running result is not ok 13355 1727096188.85918: done checking to see if all hosts have failed 13355 1727096188.85919: getting the remaining hosts for this loop 13355 1727096188.85920: done getting the remaining hosts for this loop 13355 1727096188.85923: getting the next task for host managed_node3 13355 1727096188.85930: done getting next task for host managed_node3 13355 1727096188.85933: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096188.85936: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False
13355 1727096188.85948: getting variables
13355 1727096188.85949: in VariableManager get_vars()
13355 1727096188.86214: Calling all_inventory to load vars for managed_node3
13355 1727096188.86217: Calling groups_inventory to load vars for managed_node3
13355 1727096188.86219: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096188.86280: done sending task result for task 0afff68d-5257-c514-593f-0000000000df
13355 1727096188.86283: WORKER PROCESS EXITING
13355 1727096188.86292: Calling all_plugins_play to load vars for managed_node3
13355 1727096188.86295: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096188.86298: Calling groups_plugins_play to load vars for managed_node3
13355 1727096188.95952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096188.97863: done with get_vars()
13355 1727096188.98107: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Monday 23 September 2024 08:56:28 -0400 (0:00:00.976) 0:00:38.243 ******
13355 1727096188.98291: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state
13355 1727096188.99046: worker is 1 (out of 1 available)
13355 1727096188.99175: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state
13355 1727096188.99193: done queuing things up, now waiting for results queue to drain
13355 1727096188.99196: waiting for pending results...
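The state task queued here is gated on the role's `network_state` variable, which the log shows resolving to its empty default, so the conditional `network_state != {}` evaluates to False and the task is skipped. A small sketch of that gate (`state_task_should_run` is a hypothetical helper name, not role code):

```python
def state_task_should_run(network_state: dict) -> bool:
    """Mirror of the role's conditional: the nmstate-based state task
    runs only when network_state is non-empty."""
    return network_state != {}

# In this run, network_state came from the role defaults as {}, so the
# "Configure networking state" task is skipped.
print(state_task_should_run({}))                  # → False
print(state_task_should_run({"interfaces": []}))  # → True
```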
13355 1727096188.99515: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096188.99904: in run() - task 0afff68d-5257-c514-593f-0000000000e0 13355 1727096189.00077: variable 'ansible_search_path' from source: unknown 13355 1727096189.00080: variable 'ansible_search_path' from source: unknown 13355 1727096189.00115: calling self._execute() 13355 1727096189.00320: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.00376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.00392: variable 'omit' from source: magic vars 13355 1727096189.01408: variable 'ansible_distribution_major_version' from source: facts 13355 1727096189.01493: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096189.01784: variable 'network_state' from source: role '' defaults 13355 1727096189.01789: Evaluated conditional (network_state != {}): False 13355 1727096189.01791: when evaluation is False, skipping this task 13355 1727096189.01793: _execute() done 13355 1727096189.01795: dumping result to json 13355 1727096189.01797: done dumping result, returning 13355 1727096189.01800: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-c514-593f-0000000000e0] 13355 1727096189.01802: sending task result for task 0afff68d-5257-c514-593f-0000000000e0 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096189.02054: no more pending results, returning what we have 13355 1727096189.02061: results queue empty 13355 1727096189.02063: checking for any_errors_fatal 13355 1727096189.02083: done checking for any_errors_fatal 13355 1727096189.02084: checking for max_fail_percentage 13355 1727096189.02086: done checking for max_fail_percentage 13355 1727096189.02087: 
checking to see if all hosts have failed and the running result is not ok 13355 1727096189.02088: done checking to see if all hosts have failed 13355 1727096189.02089: getting the remaining hosts for this loop 13355 1727096189.02090: done getting the remaining hosts for this loop 13355 1727096189.02094: getting the next task for host managed_node3 13355 1727096189.02107: done getting next task for host managed_node3 13355 1727096189.02112: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096189.02115: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
13355 1727096189.02140: getting variables
13355 1727096189.02142: in VariableManager get_vars()
13355 1727096189.02254: Calling all_inventory to load vars for managed_node3
13355 1727096189.02260: Calling groups_inventory to load vars for managed_node3
13355 1727096189.02264: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096189.02278: Calling all_plugins_play to load vars for managed_node3
13355 1727096189.02282: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096189.02285: Calling groups_plugins_play to load vars for managed_node3
13355 1727096189.03247: done sending task result for task 0afff68d-5257-c514-593f-0000000000e0
13355 1727096189.03251: WORKER PROCESS EXITING
13355 1727096189.04706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096189.06348: done with get_vars()
13355 1727096189.06385: done getting variables
13355 1727096189.06445: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Monday 23 September 2024 08:56:29 -0400 (0:00:00.082) 0:00:38.325 ******
13355 1727096189.06490: entering _queue_task() for managed_node3/debug
13355 1727096189.06999: worker is 1 (out of 1 available)
13355 1727096189.07010: exiting _queue_task() for managed_node3/debug
13355 1727096189.07022: done queuing things up, now waiting for results queue to drain
13355 1727096189.07023: waiting for pending results...
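The debug task that runs next prints `__network_connections_result.stderr_lines`, which Ansible derives by splitting the module's `stderr` string on newlines. A small sketch with two of the six entries copied from the log:

```python
# stderr as returned by network_connections (two of the six entries shown).
stderr = (
    "[007] #0, state:up persistent_state:present, 'bond0': add connection "
    "bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7\n"
    "[011] #1, state:up persistent_state:present, 'bond0.0': up connection "
    "bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)\n"
)

# stderr_lines is the newline-split form of stderr, as shown in the
# debug task's output below.
stderr_lines = stderr.splitlines()
assert stderr_lines[0].startswith("[007]")
assert "(not-active)" in stderr_lines[1]
```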
13355 1727096189.07209: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096189.07359: in run() - task 0afff68d-5257-c514-593f-0000000000e1 13355 1727096189.07388: variable 'ansible_search_path' from source: unknown 13355 1727096189.07396: variable 'ansible_search_path' from source: unknown 13355 1727096189.07438: calling self._execute() 13355 1727096189.07555: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.07575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.07596: variable 'omit' from source: magic vars 13355 1727096189.08185: variable 'ansible_distribution_major_version' from source: facts 13355 1727096189.08188: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096189.08191: variable 'omit' from source: magic vars 13355 1727096189.08193: variable 'omit' from source: magic vars 13355 1727096189.08195: variable 'omit' from source: magic vars 13355 1727096189.08197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096189.08227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096189.08258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096189.08301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096189.08317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096189.08352: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096189.08364: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.08373: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13355 1727096189.08480: Set connection var ansible_shell_executable to /bin/sh 13355 1727096189.08492: Set connection var ansible_shell_type to sh 13355 1727096189.08501: Set connection var ansible_pipelining to False 13355 1727096189.08518: Set connection var ansible_connection to ssh 13355 1727096189.08527: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096189.08536: Set connection var ansible_timeout to 10 13355 1727096189.08569: variable 'ansible_shell_executable' from source: unknown 13355 1727096189.08595: variable 'ansible_connection' from source: unknown 13355 1727096189.08602: variable 'ansible_module_compression' from source: unknown 13355 1727096189.08609: variable 'ansible_shell_type' from source: unknown 13355 1727096189.08624: variable 'ansible_shell_executable' from source: unknown 13355 1727096189.08729: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.08733: variable 'ansible_pipelining' from source: unknown 13355 1727096189.08735: variable 'ansible_timeout' from source: unknown 13355 1727096189.08737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.08803: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096189.08821: variable 'omit' from source: magic vars 13355 1727096189.08837: starting attempt loop 13355 1727096189.08844: running the handler 13355 1727096189.08993: variable '__network_connections_result' from source: set_fact 13355 1727096189.09081: handler run complete 13355 1727096189.09106: attempt loop complete, returning result 13355 1727096189.09113: _execute() done 13355 1727096189.09119: dumping result to json 13355 1727096189.09127: 
done dumping result, returning 13355 1727096189.09141: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-c514-593f-0000000000e1] 13355 1727096189.09150: sending task result for task 0afff68d-5257-c514-593f-0000000000e1 ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7", "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980", "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)" ] } 13355 1727096189.09344: no more pending results, returning what we have 13355 1727096189.09347: results queue empty 13355 1727096189.09348: checking for any_errors_fatal 13355 1727096189.09353: done checking for any_errors_fatal 13355 1727096189.09354: checking for max_fail_percentage 13355 1727096189.09359: done checking for max_fail_percentage 13355 1727096189.09360: checking to see if all hosts have failed and the running result is not ok 13355 1727096189.09361: done checking to see if all hosts have failed 13355 1727096189.09361: getting the remaining hosts for this loop 13355 1727096189.09363: done getting the remaining hosts for this loop 13355 1727096189.09366: getting the next task for host managed_node3 13355 1727096189.09376: done getting next task for host managed_node3 13355 1727096189.09491: ^ task is: TASK: 
fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096189.09494: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096189.09509: getting variables 13355 1727096189.09511: in VariableManager get_vars() 13355 1727096189.09674: Calling all_inventory to load vars for managed_node3 13355 1727096189.09679: Calling groups_inventory to load vars for managed_node3 13355 1727096189.09682: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096189.09693: Calling all_plugins_play to load vars for managed_node3 13355 1727096189.09696: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096189.09699: Calling groups_plugins_play to load vars for managed_node3 13355 1727096189.10386: done sending task result for task 0afff68d-5257-c514-593f-0000000000e1 13355 1727096189.10389: WORKER PROCESS EXITING 13355 1727096189.12545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096189.15135: done with get_vars() 13355 1727096189.15176: done getting variables 13355 1727096189.15236: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:56:29 -0400 (0:00:00.087) 0:00:38.413 ****** 13355 1727096189.15286: entering _queue_task() for managed_node3/debug 13355 1727096189.15660: worker is 1 (out of 1 available) 13355 1727096189.15674: exiting _queue_task() for managed_node3/debug 13355 1727096189.15687: done queuing things up, now waiting for results queue to drain 13355 1727096189.15688: waiting for pending results... 13355 1727096189.16003: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096189.16174: in run() - task 0afff68d-5257-c514-593f-0000000000e2 13355 1727096189.16196: variable 'ansible_search_path' from source: unknown 13355 1727096189.16203: variable 'ansible_search_path' from source: unknown 13355 1727096189.16249: calling self._execute() 13355 1727096189.16366: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.16380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.16393: variable 'omit' from source: magic vars 13355 1727096189.16798: variable 'ansible_distribution_major_version' from source: facts 13355 1727096189.16815: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096189.16825: variable 'omit' from source: magic vars 13355 1727096189.16896: variable 'omit' from source: magic vars 13355 1727096189.16938: variable 'omit' from source: magic vars 13355 1727096189.16988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096189.17035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096189.17062: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096189.17086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096189.17102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096189.17143: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096189.17220: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.17223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.17272: Set connection var ansible_shell_executable to /bin/sh 13355 1727096189.17283: Set connection var ansible_shell_type to sh 13355 1727096189.17292: Set connection var ansible_pipelining to False 13355 1727096189.17300: Set connection var ansible_connection to ssh 13355 1727096189.17309: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096189.17317: Set connection var ansible_timeout to 10 13355 1727096189.17349: variable 'ansible_shell_executable' from source: unknown 13355 1727096189.17359: variable 'ansible_connection' from source: unknown 13355 1727096189.17367: variable 'ansible_module_compression' from source: unknown 13355 1727096189.17375: variable 'ansible_shell_type' from source: unknown 13355 1727096189.17381: variable 'ansible_shell_executable' from source: unknown 13355 1727096189.17387: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.17394: variable 'ansible_pipelining' from source: unknown 13355 1727096189.17399: variable 'ansible_timeout' from source: unknown 13355 1727096189.17406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.17569: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096189.17658: variable 'omit' from source: magic vars 13355 1727096189.17661: starting attempt loop 13355 1727096189.17664: running the handler 13355 1727096189.17666: variable '__network_connections_result' from source: set_fact 13355 1727096189.17737: variable '__network_connections_result' from source: set_fact 13355 1727096189.17926: handler run complete 13355 1727096189.17963: attempt loop complete, returning result 13355 1727096189.17972: _execute() done 13355 1727096189.17984: dumping result to json 13355 1727096189.17993: done dumping result, returning 13355 1727096189.18006: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-c514-593f-0000000000e2] 13355 1727096189.18016: sending task result for task 0afff68d-5257-c514-593f-0000000000e2
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7\n[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980\n[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7",
            "[008] #1, state:up persistent_state:present, 'bond0.0': update connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980",
            "[009] #2, state:up persistent_state:present, 'bond0.1': update connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a55cc535-e38f-4547-bb0f-3479e284a0c7 (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0dd2062c-1ff8-43a7-a41c-6a1fd34b6980 (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, c28ac129-a5cf-428d-a75a-c74a7d1cb1ab (not-active)"
        ]
    }
}
13355 1727096189.18335: no more pending results, returning what we have 13355 1727096189.18339: results queue empty 13355 1727096189.18340: checking for any_errors_fatal 13355 1727096189.18348: done checking for any_errors_fatal 13355 1727096189.18348: checking for max_fail_percentage 13355 1727096189.18360: done checking for max_fail_percentage 13355 1727096189.18361: checking to see if all hosts have failed and the running result is not ok 13355 1727096189.18362: done checking to see if all hosts have failed 13355 1727096189.18362: getting the remaining hosts for this loop 13355 1727096189.18364: done getting the remaining hosts for this loop 13355 1727096189.18369: getting the next task for host managed_node3 13355
1727096189.18376: done getting next task for host managed_node3 13355 1727096189.18380: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096189.18383: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096189.18396: getting variables 13355 1727096189.18397: in VariableManager get_vars() 13355 1727096189.18569: Calling all_inventory to load vars for managed_node3 13355 1727096189.18572: Calling groups_inventory to load vars for managed_node3 13355 1727096189.18575: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096189.18587: Calling all_plugins_play to load vars for managed_node3 13355 1727096189.18590: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096189.18593: Calling groups_plugins_play to load vars for managed_node3 13355 1727096189.19389: done sending task result for task 0afff68d-5257-c514-593f-0000000000e2 13355 1727096189.19392: WORKER PROCESS EXITING 13355 1727096189.20236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096189.21869: done with get_vars() 13355 1727096189.21898: done getting variables 13355 1727096189.21969: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:56:29 -0400 (0:00:00.067) 0:00:38.480 ****** 13355 1727096189.22003: entering _queue_task() for managed_node3/debug 13355 1727096189.22395: worker is 1 (out of 1 available) 13355 1727096189.22408: exiting _queue_task() for managed_node3/debug 13355 1727096189.22420: done queuing things up, now waiting for results queue to drain 13355 1727096189.22421: waiting for pending results... 13355 1727096189.22709: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096189.22852: in run() - task 0afff68d-5257-c514-593f-0000000000e3 13355 1727096189.22879: variable 'ansible_search_path' from source: unknown 13355 1727096189.22886: variable 'ansible_search_path' from source: unknown 13355 1727096189.22932: calling self._execute() 13355 1727096189.23049: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.23065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.23081: variable 'omit' from source: magic vars 13355 1727096189.23504: variable 'ansible_distribution_major_version' from source: facts 13355 1727096189.23522: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096189.23651: variable 'network_state' from source: role '' defaults 13355 1727096189.23680: Evaluated conditional (network_state != {}): False 13355 1727096189.23688: when evaluation is False, skipping this task 13355 1727096189.23695: _execute() done 13355 1727096189.23704: dumping result to json 13355 1727096189.23712: done 
dumping result, returning 13355 1727096189.23724: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-c514-593f-0000000000e3] 13355 1727096189.23733: sending task result for task 0afff68d-5257-c514-593f-0000000000e3
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
13355 1727096189.23928: no more pending results, returning what we have 13355 1727096189.23931: results queue empty 13355 1727096189.23932: checking for any_errors_fatal 13355 1727096189.23941: done checking for any_errors_fatal 13355 1727096189.23942: checking for max_fail_percentage 13355 1727096189.23944: done checking for max_fail_percentage 13355 1727096189.23945: checking to see if all hosts have failed and the running result is not ok 13355 1727096189.23946: done checking to see if all hosts have failed 13355 1727096189.23946: getting the remaining hosts for this loop 13355 1727096189.23948: done getting the remaining hosts for this loop 13355 1727096189.23952: getting the next task for host managed_node3 13355 1727096189.23962: done getting next task for host managed_node3 13355 1727096189.23966: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096189.23971: ^ state is: HOST STATE: block=2, task=27, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 13355 1727096189.23997: getting variables 13355 1727096189.23999: in VariableManager get_vars() 13355 1727096189.24058: Calling all_inventory to load vars for managed_node3 13355 1727096189.24061: Calling groups_inventory to load vars for managed_node3 13355 1727096189.24064: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096189.24293: Calling all_plugins_play to load vars for managed_node3 13355 1727096189.24297: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096189.24302: Calling groups_plugins_play to load vars for managed_node3 13355 1727096189.24907: done sending task result for task 0afff68d-5257-c514-593f-0000000000e3 13355 1727096189.24911: WORKER PROCESS EXITING 13355 1727096189.25859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096189.27525: done with get_vars() 13355 1727096189.27562: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:56:29 -0400 (0:00:00.056) 0:00:38.537 ****** 13355 1727096189.27670: entering _queue_task() for managed_node3/ping 13355 1727096189.28212: worker is 1 (out of 1 available) 13355 1727096189.28221: exiting _queue_task() for managed_node3/ping 13355 1727096189.28233: done queuing things up, now waiting for results queue to drain 13355 1727096189.28235: waiting for pending results... 
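Stepping back from the raw trace for a moment: the module_args recorded earlier for the "Show debug messages for the network_connections" task describe three connection profiles (a bond and its two ports). Those would come from a `network_connections` role variable roughly like the following sketch, reconstructed from the logged payload rather than taken from the actual playbook:

```yaml
# Hypothetical reconstruction from the logged module_args; the real
# playbook that produced this run may be laid out differently.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
```

The `provider: nm` and `force_state_change: false` values seen in the log are the role's defaults here, not explicit settings in this sketch.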
13355 1727096189.28433: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096189.28624: in run() - task 0afff68d-5257-c514-593f-0000000000e4 13355 1727096189.28650: variable 'ansible_search_path' from source: unknown 13355 1727096189.28666: variable 'ansible_search_path' from source: unknown 13355 1727096189.28713: calling self._execute() 13355 1727096189.28835: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.28846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.28875: variable 'omit' from source: magic vars 13355 1727096189.29374: variable 'ansible_distribution_major_version' from source: facts 13355 1727096189.29380: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096189.29392: variable 'omit' from source: magic vars 13355 1727096189.29510: variable 'omit' from source: magic vars 13355 1727096189.29539: variable 'omit' from source: magic vars 13355 1727096189.29593: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096189.29638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096189.29749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096189.29753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096189.29761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096189.29764: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096189.29768: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.30077: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13355 1727096189.30372: Set connection var ansible_shell_executable to /bin/sh 13355 1727096189.30376: Set connection var ansible_shell_type to sh 13355 1727096189.30378: Set connection var ansible_pipelining to False 13355 1727096189.30380: Set connection var ansible_connection to ssh 13355 1727096189.30382: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096189.30384: Set connection var ansible_timeout to 10 13355 1727096189.30386: variable 'ansible_shell_executable' from source: unknown 13355 1727096189.30388: variable 'ansible_connection' from source: unknown 13355 1727096189.30390: variable 'ansible_module_compression' from source: unknown 13355 1727096189.30392: variable 'ansible_shell_type' from source: unknown 13355 1727096189.30394: variable 'ansible_shell_executable' from source: unknown 13355 1727096189.30396: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.30398: variable 'ansible_pipelining' from source: unknown 13355 1727096189.30400: variable 'ansible_timeout' from source: unknown 13355 1727096189.30402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.30769: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096189.30865: variable 'omit' from source: magic vars 13355 1727096189.30879: starting attempt loop 13355 1727096189.30886: running the handler 13355 1727096189.30909: _low_level_execute_command(): starting 13355 1727096189.30927: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096189.32320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096189.32388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 
1727096189.32395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096189.32398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096189.32401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096189.32403: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096189.32576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096189.32580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096189.32583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096189.32606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096189.32678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096189.34398: stdout chunk (state=3): >>>/root <<< 13355 1727096189.34536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096189.34598: stdout chunk (state=3): >>><<< 13355 1727096189.34602: stderr chunk (state=3): >>><<< 13355 1727096189.34784: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096189.34788: _low_level_execute_command(): starting 13355 1727096189.34791: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618 `" && echo ansible-tmp-1727096189.3462481-15026-107167750702618="` echo /root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618 `" ) && sleep 0' 13355 1727096189.36170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096189.36176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13355 1727096189.36189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096189.36197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096189.36395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096189.36399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096189.36477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096189.38486: stdout chunk (state=3): >>>ansible-tmp-1727096189.3462481-15026-107167750702618=/root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618 <<< 13355 1727096189.38617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096189.38758: stderr chunk (state=3): >>><<< 13355 1727096189.38762: stdout chunk (state=3): >>><<< 13355 1727096189.38789: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096189.3462481-15026-107167750702618=/root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096189.38988: variable 'ansible_module_compression' from source: unknown 13355 1727096189.39099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13355 1727096189.39508: variable 'ansible_facts' from source: unknown 13355 1727096189.39513: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/AnsiballZ_ping.py 13355 1727096189.40073: Sending initial data 13355 1727096189.40085: Sent initial data (153 bytes) 13355 1727096189.41101: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096189.41116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096189.41133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096189.41152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096189.41176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096189.41190: stderr chunk (state=3): >>>debug2: 
match not found <<< 13355 1727096189.41205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096189.41286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096189.41315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096189.41340: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096189.41387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096189.41415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096189.43186: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096189.43351: stderr 
chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/AnsiballZ_ping.py" <<< 13355 1727096189.43378: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpron_etzr /root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/AnsiballZ_ping.py <<< 13355 1727096189.43475: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpron_etzr" to remote "/root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/AnsiballZ_ping.py" <<< 13355 1727096189.44479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096189.44506: stderr chunk (state=3): >>><<< 13355 1727096189.44510: stdout chunk (state=3): >>><<< 13355 1727096189.44532: done transferring module to remote 13355 1727096189.44556: _low_level_execute_command(): starting 13355 1727096189.44560: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/ /root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/AnsiballZ_ping.py && sleep 0' 13355 1727096189.45811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096189.45862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096189.45866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096189.45871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096189.46079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096189.46107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096189.46211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096189.47998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096189.48053: stderr chunk (state=3): >>><<< 13355 1727096189.48057: stdout chunk (state=3): >>><<< 13355 1727096189.48290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096189.48293: _low_level_execute_command(): starting 13355 1727096189.48298: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/AnsiballZ_ping.py && sleep 0' 13355 1727096189.49312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096189.49641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096189.49787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096189.49857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 13355 1727096189.65401: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13355 1727096189.66940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096189.66945: stdout chunk (state=3): >>><<< 13355 1727096189.66974: stderr chunk (state=3): >>><<< 13355 1727096189.66978: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
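The `{"ping": "pong"}` payload returned above is easy to reproduce; the module's core logic amounts to something like this sketch (an assumed simplification, not the actual `ansible.modules.ping` source, which also wires in `AnsibleModule` argument parsing):

```python
# Minimal sketch of the ping module's handler (assumed shape).
def ping(data: str = "pong") -> dict:
    # The real module treats data == "crash" as a forced failure;
    # raising here mimics that behaviour.
    if data == "crash":
        raise RuntimeError("boom")
    return {"ping": data}

print(ping())  # prints {'ping': 'pong'}, matching the payload in the log
```

Passing a custom `data` value echoes it back instead of "pong", which is how the module doubles as a trivial round-trip test of the connection plugin.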
13355 1727096189.66997: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096189.67007: _low_level_execute_command(): starting 13355 1727096189.67173: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096189.3462481-15026-107167750702618/ > /dev/null 2>&1 && sleep 0' 13355 1727096189.68391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096189.68427: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096189.68431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096189.68433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096189.68495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096189.70577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096189.70583: stdout chunk (state=3): >>><<< 13355 1727096189.70586: stderr chunk (state=3): >>><<< 13355 1727096189.70589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096189.70602: handler run complete 13355 1727096189.70604: attempt loop complete, returning result 13355 1727096189.70606: _execute() done 
13355 1727096189.70608: dumping result to json 13355 1727096189.70610: done dumping result, returning 13355 1727096189.70612: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-c514-593f-0000000000e4] 13355 1727096189.70614: sending task result for task 0afff68d-5257-c514-593f-0000000000e4 13355 1727096189.70686: done sending task result for task 0afff68d-5257-c514-593f-0000000000e4 13355 1727096189.70690: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 13355 1727096189.70771: no more pending results, returning what we have 13355 1727096189.70775: results queue empty 13355 1727096189.70776: checking for any_errors_fatal 13355 1727096189.70782: done checking for any_errors_fatal 13355 1727096189.70782: checking for max_fail_percentage 13355 1727096189.70784: done checking for max_fail_percentage 13355 1727096189.70785: checking to see if all hosts have failed and the running result is not ok 13355 1727096189.70786: done checking to see if all hosts have failed 13355 1727096189.70787: getting the remaining hosts for this loop 13355 1727096189.70788: done getting the remaining hosts for this loop 13355 1727096189.70791: getting the next task for host managed_node3 13355 1727096189.70802: done getting next task for host managed_node3 13355 1727096189.70804: ^ task is: TASK: meta (role_complete) 13355 1727096189.70806: ^ state is: HOST STATE: block=2, task=28, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
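The exchange above shows `_low_level_execute_command()` running the wrapped AnsiballZ payload over the multiplexed SSH connection, then recovering the module result (`{"ping": "pong", ...}`) from stdout once `rc=0`. A minimal sketch of that run-and-parse step, assuming a placeholder command in place of the real AnsiballZ invocation (this is not Ansible's actual API, just an illustration of the rc/stdout/stderr handling the log records):

```python
import json
import subprocess

def run_module_payload(cmd):
    """Run a wrapped module command and parse its JSON result from stdout.

    Mirrors the rc/stdout/stderr triple that _low_level_execute_command()
    logs above; `cmd` stands in for the real AnsiballZ invocation.
    """
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    if proc.returncode != 0:
        raise RuntimeError(f"module failed: rc={proc.returncode}, stderr={proc.stderr}")
    return json.loads(proc.stdout)

# Stand-in for the ping module round-trip seen in the log:
cmd = """echo '{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}'"""
result = run_module_payload(cmd)
```

The SSH debug chatter (`auto-mux`, `mux_client_request_session`) lands on stderr, which is why the result parse only ever touches stdout.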
False, did start at task? False 13355 1727096189.70818: getting variables 13355 1727096189.70820: in VariableManager get_vars() 13355 1727096189.71080: Calling all_inventory to load vars for managed_node3 13355 1727096189.71084: Calling groups_inventory to load vars for managed_node3 13355 1727096189.71087: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096189.71097: Calling all_plugins_play to load vars for managed_node3 13355 1727096189.71101: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096189.71104: Calling groups_plugins_play to load vars for managed_node3 13355 1727096189.74250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096189.77706: done with get_vars() 13355 1727096189.77740: done getting variables 13355 1727096189.77934: done queuing things up, now waiting for results queue to drain 13355 1727096189.77937: results queue empty 13355 1727096189.77938: checking for any_errors_fatal 13355 1727096189.77941: done checking for any_errors_fatal 13355 1727096189.77941: checking for max_fail_percentage 13355 1727096189.77942: done checking for max_fail_percentage 13355 1727096189.77943: checking to see if all hosts have failed and the running result is not ok 13355 1727096189.77944: done checking to see if all hosts have failed 13355 1727096189.77944: getting the remaining hosts for this loop 13355 1727096189.77945: done getting the remaining hosts for this loop 13355 1727096189.77948: getting the next task for host managed_node3 13355 1727096189.77953: done getting next task for host managed_node3 13355 1727096189.77956: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096189.77958: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096189.77970: getting variables 13355 1727096189.77971: in VariableManager get_vars() 13355 1727096189.78193: Calling all_inventory to load vars for managed_node3 13355 1727096189.78196: Calling groups_inventory to load vars for managed_node3 13355 1727096189.78198: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096189.78203: Calling all_plugins_play to load vars for managed_node3 13355 1727096189.78206: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096189.78208: Calling groups_plugins_play to load vars for managed_node3 13355 1727096189.80508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096189.84248: done with get_vars() 13355 1727096189.84396: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:56:29 -0400 (0:00:00.569) 0:00:39.106 ****** 13355 1727096189.84581: entering _queue_task() for managed_node3/include_tasks 13355 1727096189.85389: worker is 1 (out of 1 available) 13355 1727096189.85402: exiting _queue_task() for managed_node3/include_tasks 13355 1727096189.85413: done queuing things up, now waiting for results queue to drain 13355 1727096189.85415: waiting for pending results... 
13355 1727096189.86197: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096189.86202: in run() - task 0afff68d-5257-c514-593f-00000000011b 13355 1727096189.86205: variable 'ansible_search_path' from source: unknown 13355 1727096189.86207: variable 'ansible_search_path' from source: unknown 13355 1727096189.86210: calling self._execute() 13355 1727096189.86496: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096189.86513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096189.86526: variable 'omit' from source: magic vars 13355 1727096189.87324: variable 'ansible_distribution_major_version' from source: facts 13355 1727096189.87599: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096189.87604: _execute() done 13355 1727096189.87607: dumping result to json 13355 1727096189.87610: done dumping result, returning 13355 1727096189.87612: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-c514-593f-00000000011b] 13355 1727096189.87614: sending task result for task 0afff68d-5257-c514-593f-00000000011b 13355 1727096189.87694: done sending task result for task 0afff68d-5257-c514-593f-00000000011b 13355 1727096189.87697: WORKER PROCESS EXITING 13355 1727096189.87752: no more pending results, returning what we have 13355 1727096189.87759: in VariableManager get_vars() 13355 1727096189.87820: Calling all_inventory to load vars for managed_node3 13355 1727096189.87822: Calling groups_inventory to load vars for managed_node3 13355 1727096189.87825: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096189.87837: Calling all_plugins_play to load vars for managed_node3 13355 1727096189.87841: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096189.87843: Calling 
groups_plugins_play to load vars for managed_node3 13355 1727096189.90798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096189.94216: done with get_vars() 13355 1727096189.94364: variable 'ansible_search_path' from source: unknown 13355 1727096189.94366: variable 'ansible_search_path' from source: unknown 13355 1727096189.94411: we have included files to process 13355 1727096189.94412: generating all_blocks data 13355 1727096189.94414: done generating all_blocks data 13355 1727096189.94420: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096189.94421: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096189.94423: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096189.95692: done processing included file 13355 1727096189.95695: iterating over new_blocks loaded from include file 13355 1727096189.95697: in VariableManager get_vars() 13355 1727096189.95735: done with get_vars() 13355 1727096189.95737: filtering new block on tags 13355 1727096189.95873: done filtering new block on tags 13355 1727096189.95876: in VariableManager get_vars() 13355 1727096189.95910: done with get_vars() 13355 1727096189.95912: filtering new block on tags 13355 1727096189.95933: done filtering new block on tags 13355 1727096189.95936: in VariableManager get_vars() 13355 1727096189.96081: done with get_vars() 13355 1727096189.96083: filtering new block on tags 13355 1727096189.96101: done filtering new block on tags 13355 1727096189.96103: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13355 1727096189.96109: extending task lists for 
all hosts with included blocks 13355 1727096189.97306: done extending task lists 13355 1727096189.97309: done processing included files 13355 1727096189.97309: results queue empty 13355 1727096189.97310: checking for any_errors_fatal 13355 1727096189.97312: done checking for any_errors_fatal 13355 1727096189.97312: checking for max_fail_percentage 13355 1727096189.97314: done checking for max_fail_percentage 13355 1727096189.97315: checking to see if all hosts have failed and the running result is not ok 13355 1727096189.97316: done checking to see if all hosts have failed 13355 1727096189.97316: getting the remaining hosts for this loop 13355 1727096189.97318: done getting the remaining hosts for this loop 13355 1727096189.97320: getting the next task for host managed_node3 13355 1727096189.97324: done getting next task for host managed_node3 13355 1727096189.97327: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096189.97330: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096189.97342: getting variables 13355 1727096189.97343: in VariableManager get_vars() 13355 1727096189.97371: Calling all_inventory to load vars for managed_node3 13355 1727096189.97374: Calling groups_inventory to load vars for managed_node3 13355 1727096189.97380: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096189.97387: Calling all_plugins_play to load vars for managed_node3 13355 1727096189.97390: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096189.97393: Calling groups_plugins_play to load vars for managed_node3 13355 1727096189.99064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096190.00887: done with get_vars() 13355 1727096190.00919: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:56:30 -0400 (0:00:00.164) 0:00:39.270 ****** 13355 1727096190.01012: entering _queue_task() for managed_node3/setup 13355 1727096190.01575: worker is 1 (out of 1 available) 13355 1727096190.01585: exiting _queue_task() for managed_node3/setup 13355 1727096190.01596: done queuing things up, now waiting for results queue to drain 13355 1727096190.01597: waiting for pending results... 
13355 1727096190.01703: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096190.01885: in run() - task 0afff68d-5257-c514-593f-00000000084f 13355 1727096190.01907: variable 'ansible_search_path' from source: unknown 13355 1727096190.01914: variable 'ansible_search_path' from source: unknown 13355 1727096190.01958: calling self._execute() 13355 1727096190.02077: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096190.02088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096190.02106: variable 'omit' from source: magic vars 13355 1727096190.02501: variable 'ansible_distribution_major_version' from source: facts 13355 1727096190.02533: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096190.02755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096190.05178: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096190.05197: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096190.05244: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096190.05285: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096190.05321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096190.05411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096190.05446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096190.05480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096190.05624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096190.05627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096190.05629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096190.05632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096190.05757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096190.05846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096190.05870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096190.06056: variable '__network_required_facts' from source: role 
'' defaults 13355 1727096190.06077: variable 'ansible_facts' from source: unknown 13355 1727096190.06842: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13355 1727096190.06852: when evaluation is False, skipping this task 13355 1727096190.06873: _execute() done 13355 1727096190.06876: dumping result to json 13355 1727096190.06879: done dumping result, returning 13355 1727096190.06927: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-c514-593f-00000000084f] 13355 1727096190.06930: sending task result for task 0afff68d-5257-c514-593f-00000000084f skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096190.07180: no more pending results, returning what we have 13355 1727096190.07185: results queue empty 13355 1727096190.07186: checking for any_errors_fatal 13355 1727096190.07187: done checking for any_errors_fatal 13355 1727096190.07188: checking for max_fail_percentage 13355 1727096190.07190: done checking for max_fail_percentage 13355 1727096190.07191: checking to see if all hosts have failed and the running result is not ok 13355 1727096190.07191: done checking to see if all hosts have failed 13355 1727096190.07192: getting the remaining hosts for this loop 13355 1727096190.07194: done getting the remaining hosts for this loop 13355 1727096190.07198: getting the next task for host managed_node3 13355 1727096190.07208: done getting next task for host managed_node3 13355 1727096190.07212: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096190.07217: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
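The "Ensure ansible_facts used by role are present" task is skipped here because the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated False: every fact the role needs was already gathered. A plain-Python sketch of that gate, with illustrative fact names (the role's actual `__network_required_facts` list is not shown in this log):

```python
def needs_fact_gathering(required_facts, ansible_facts):
    """Re-run setup only when some required fact is missing.

    Python equivalent of the Jinja condition evaluated in the log:
    __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    """
    missing = set(required_facts) - set(ansible_facts.keys())
    return len(missing) > 0

# Illustrative fact names, not the role's real requirements list:
required = ["distribution", "os_family"]
needs_fact_gathering(required, {"distribution": "Fedora"})              # missing os_family
needs_fact_gathering(required, {"distribution": "Fedora",
                                "os_family": "RedHat"})                 # nothing missing
```

When nothing is missing the `when:` evaluates False and the setup module never runs, which is exactly the `skipping:` result recorded above.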
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096190.07237: getting variables 13355 1727096190.07239: in VariableManager get_vars() 13355 1727096190.07300: Calling all_inventory to load vars for managed_node3 13355 1727096190.07304: Calling groups_inventory to load vars for managed_node3 13355 1727096190.07306: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096190.07316: Calling all_plugins_play to load vars for managed_node3 13355 1727096190.07319: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096190.07323: Calling groups_plugins_play to load vars for managed_node3 13355 1727096190.07893: done sending task result for task 0afff68d-5257-c514-593f-00000000084f 13355 1727096190.07896: WORKER PROCESS EXITING 13355 1727096190.08954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096190.12279: done with get_vars() 13355 1727096190.12311: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:56:30 -0400 (0:00:00.116) 0:00:39.386 ****** 13355 1727096190.12644: entering _queue_task() for managed_node3/stat 13355 1727096190.13430: worker is 
1 (out of 1 available) 13355 1727096190.13445: exiting _queue_task() for managed_node3/stat 13355 1727096190.13501: done queuing things up, now waiting for results queue to drain 13355 1727096190.13509: waiting for pending results... 13355 1727096190.13803: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096190.13960: in run() - task 0afff68d-5257-c514-593f-000000000851 13355 1727096190.13981: variable 'ansible_search_path' from source: unknown 13355 1727096190.13984: variable 'ansible_search_path' from source: unknown 13355 1727096190.14022: calling self._execute() 13355 1727096190.14134: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096190.14138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096190.14147: variable 'omit' from source: magic vars 13355 1727096190.14544: variable 'ansible_distribution_major_version' from source: facts 13355 1727096190.14555: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096190.14725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096190.15015: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096190.15072: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096190.15103: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096190.15136: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096190.15226: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096190.15250: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096190.15280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096190.15310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096190.15414: variable '__network_is_ostree' from source: set_fact 13355 1727096190.15420: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096190.15423: when evaluation is False, skipping this task 13355 1727096190.15425: _execute() done 13355 1727096190.15428: dumping result to json 13355 1727096190.15434: done dumping result, returning 13355 1727096190.15444: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-c514-593f-000000000851] 13355 1727096190.15499: sending task result for task 0afff68d-5257-c514-593f-000000000851 13355 1727096190.15773: done sending task result for task 0afff68d-5257-c514-593f-000000000851 13355 1727096190.15776: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096190.15888: no more pending results, returning what we have 13355 1727096190.15892: results queue empty 13355 1727096190.15893: checking for any_errors_fatal 13355 1727096190.15898: done checking for any_errors_fatal 13355 1727096190.15899: checking for max_fail_percentage 13355 1727096190.15901: done checking for max_fail_percentage 13355 1727096190.15902: checking to see if all hosts have failed and the running result is not ok 13355 
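Both the "Check if system is ostree" and the following "Set flag" tasks are skipped on the condition `not __network_is_ostree is defined`, which is False once an earlier `set_fact` has defined the flag. A sketch of that idempotency check, assuming the host's variables behave like a plain dict (a simplification of Ansible's variable precedence):

```python
def should_check_ostree(host_vars):
    """Run the ostree detection tasks only if the flag is not yet set.

    Python rendering of the Jinja condition `not __network_is_ostree is defined`
    that the log evaluates to False above.
    """
    return "__network_is_ostree" not in host_vars

# Before any set_fact the detection runs; afterwards both tasks are skipped:
should_check_ostree({})                                  # flag absent
should_check_ostree({"__network_is_ostree": False})      # flag present
```

This is a common role pattern: an expensive `stat`/`set_fact` pair guarded so it executes at most once per host, even when the role is applied repeatedly in the same play.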
1727096190.15903: done checking to see if all hosts have failed 13355 1727096190.15903: getting the remaining hosts for this loop 13355 1727096190.15905: done getting the remaining hosts for this loop 13355 1727096190.15909: getting the next task for host managed_node3 13355 1727096190.15915: done getting next task for host managed_node3 13355 1727096190.15920: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096190.15924: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096190.15942: getting variables 13355 1727096190.15944: in VariableManager get_vars() 13355 1727096190.15997: Calling all_inventory to load vars for managed_node3 13355 1727096190.16001: Calling groups_inventory to load vars for managed_node3 13355 1727096190.16003: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096190.16013: Calling all_plugins_play to load vars for managed_node3 13355 1727096190.16015: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096190.16018: Calling groups_plugins_play to load vars for managed_node3 13355 1727096190.17908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096190.19959: done with get_vars() 13355 1727096190.20000: done getting variables 13355 1727096190.20091: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:56:30 -0400 (0:00:00.074) 0:00:39.461 ****** 13355 1727096190.20136: entering _queue_task() for managed_node3/set_fact 13355 1727096190.20738: worker is 1 (out of 1 available) 13355 1727096190.20750: exiting _queue_task() for managed_node3/set_fact 13355 1727096190.20764: done queuing things up, now waiting for results queue to drain 13355 1727096190.20768: waiting for pending results... 
13355 1727096190.21096: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096190.21323: in run() - task 0afff68d-5257-c514-593f-000000000852 13355 1727096190.21358: variable 'ansible_search_path' from source: unknown 13355 1727096190.21362: variable 'ansible_search_path' from source: unknown 13355 1727096190.21437: calling self._execute() 13355 1727096190.21680: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096190.21692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096190.21724: variable 'omit' from source: magic vars 13355 1727096190.22464: variable 'ansible_distribution_major_version' from source: facts 13355 1727096190.22525: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096190.22841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096190.23330: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096190.23399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096190.23436: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096190.23525: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096190.23638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096190.23666: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096190.23694: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096190.23749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096190.23949: variable '__network_is_ostree' from source: set_fact 13355 1727096190.23953: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096190.23955: when evaluation is False, skipping this task 13355 1727096190.23957: _execute() done 13355 1727096190.23959: dumping result to json 13355 1727096190.23961: done dumping result, returning 13355 1727096190.23964: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-c514-593f-000000000852] 13355 1727096190.23966: sending task result for task 0afff68d-5257-c514-593f-000000000852 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096190.24125: no more pending results, returning what we have 13355 1727096190.24129: results queue empty 13355 1727096190.24130: checking for any_errors_fatal 13355 1727096190.24251: done checking for any_errors_fatal 13355 1727096190.24253: checking for max_fail_percentage 13355 1727096190.24255: done checking for max_fail_percentage 13355 1727096190.24258: checking to see if all hosts have failed and the running result is not ok 13355 1727096190.24259: done checking to see if all hosts have failed 13355 1727096190.24260: getting the remaining hosts for this loop 13355 1727096190.24262: done getting the remaining hosts for this loop 13355 1727096190.24265: getting the next task for host managed_node3 13355 1727096190.24278: done getting next task for host managed_node3 13355 
1727096190.24282: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096190.24287: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096190.24315: getting variables 13355 1727096190.24317: in VariableManager get_vars() 13355 1727096190.24496: Calling all_inventory to load vars for managed_node3 13355 1727096190.24499: Calling groups_inventory to load vars for managed_node3 13355 1727096190.24502: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096190.24512: Calling all_plugins_play to load vars for managed_node3 13355 1727096190.24515: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096190.24519: Calling groups_plugins_play to load vars for managed_node3 13355 1727096190.25038: done sending task result for task 0afff68d-5257-c514-593f-000000000852 13355 1727096190.25043: WORKER PROCESS EXITING 13355 1727096190.26150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096190.28396: done with get_vars() 13355 1727096190.28461: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:56:30 -0400 (0:00:00.084) 0:00:39.546 ****** 13355 1727096190.28582: entering _queue_task() for managed_node3/service_facts 13355 1727096190.29289: worker is 1 (out of 1 available) 13355 1727096190.29300: exiting _queue_task() for managed_node3/service_facts 13355 1727096190.29312: done queuing things up, now waiting for results queue to drain 13355 1727096190.29313: waiting for pending results... 13355 1727096190.29787: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096190.30084: in run() - task 0afff68d-5257-c514-593f-000000000854 13355 1727096190.30089: variable 'ansible_search_path' from source: unknown 13355 1727096190.30092: variable 'ansible_search_path' from source: unknown 13355 1727096190.30112: calling self._execute() 13355 1727096190.30283: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096190.30286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096190.30473: variable 'omit' from source: magic vars 13355 1727096190.30748: variable 'ansible_distribution_major_version' from source: facts 13355 1727096190.30765: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096190.30773: variable 'omit' from source: magic vars 13355 1727096190.30858: variable 'omit' from source: magic vars 13355 1727096190.30898: variable 'omit' from source: magic vars 13355 1727096190.30944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096190.30985: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096190.31005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 
1727096190.31027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096190.31034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096190.31139: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096190.31143: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096190.31145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096190.31183: Set connection var ansible_shell_executable to /bin/sh 13355 1727096190.31189: Set connection var ansible_shell_type to sh 13355 1727096190.31195: Set connection var ansible_pipelining to False 13355 1727096190.31200: Set connection var ansible_connection to ssh 13355 1727096190.31205: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096190.31246: Set connection var ansible_timeout to 10 13355 1727096190.31249: variable 'ansible_shell_executable' from source: unknown 13355 1727096190.31252: variable 'ansible_connection' from source: unknown 13355 1727096190.31263: variable 'ansible_module_compression' from source: unknown 13355 1727096190.31265: variable 'ansible_shell_type' from source: unknown 13355 1727096190.31269: variable 'ansible_shell_executable' from source: unknown 13355 1727096190.31271: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096190.31273: variable 'ansible_pipelining' from source: unknown 13355 1727096190.31275: variable 'ansible_timeout' from source: unknown 13355 1727096190.31277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096190.31575: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096190.31585: variable 'omit' from source: magic vars 13355 1727096190.31593: starting attempt loop 13355 1727096190.31595: running the handler 13355 1727096190.31601: _low_level_execute_command(): starting 13355 1727096190.31607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096190.32619: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096190.32676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096190.32715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096190.32719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096190.32799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096190.34724: stdout chunk (state=3): >>>/root <<< 13355 1727096190.34728: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 13355 1727096190.34731: stderr chunk (state=3): >>><<< 13355 1727096190.34733: stdout chunk (state=3): >>><<< 13355 1727096190.34761: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096190.34786: _low_level_execute_command(): starting 13355 1727096190.34873: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126 `" && echo ansible-tmp-1727096190.34771-15060-112432448839126="` echo /root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126 `" ) && sleep 0' 13355 1727096190.35546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096190.35565: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 13355 1727096190.35601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096190.35634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096190.35775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096190.35818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096190.35889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096190.37942: stdout chunk (state=3): >>>ansible-tmp-1727096190.34771-15060-112432448839126=/root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126 <<< 13355 1727096190.38114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096190.38192: stdout chunk (state=3): >>><<< 13355 1727096190.38196: stderr chunk (state=3): >>><<< 13355 1727096190.38305: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096190.34771-15060-112432448839126=/root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096190.38514: variable 'ansible_module_compression' from source: unknown 13355 1727096190.38552: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13355 1727096190.38614: variable 'ansible_facts' from source: unknown 13355 1727096190.38747: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/AnsiballZ_service_facts.py 13355 1727096190.39037: Sending initial data 13355 1727096190.39040: Sent initial data (160 bytes) 13355 1727096190.40274: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096190.40294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096190.40386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096190.40499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096190.40558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096190.40639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096190.42340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096190.42407: stderr chunk (state=3): >>>debug2: 
Sending SSH2_FXP_REALPATH "." <<< 13355 1727096190.42412: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp755a3w0f /root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/AnsiballZ_service_facts.py <<< 13355 1727096190.42508: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/AnsiballZ_service_facts.py" <<< 13355 1727096190.42512: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp755a3w0f" to remote "/root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/AnsiballZ_service_facts.py" <<< 13355 1727096190.43777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096190.43781: stdout chunk (state=3): >>><<< 13355 1727096190.43783: stderr chunk (state=3): >>><<< 13355 1727096190.43785: done transferring module to remote 13355 1727096190.43787: _low_level_execute_command(): starting 13355 1727096190.43789: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/ /root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/AnsiballZ_service_facts.py && sleep 0' 13355 1727096190.44559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096190.44578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096190.44590: stderr chunk (state=3): >>>debug2: match found <<< 13355 1727096190.44664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096190.44799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096190.44860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096190.44913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096190.46881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096190.46884: stdout chunk (state=3): >>><<< 13355 1727096190.46886: stderr chunk (state=3): >>><<< 13355 1727096190.46891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096190.46899: _low_level_execute_command(): starting 13355 1727096190.46902: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/AnsiballZ_service_facts.py && sleep 0' 13355 1727096190.47916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096190.47954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096190.48071: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096190.48106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 
1727096190.48135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096190.48258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096192.13788: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": 
{"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": 
{"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": 
"user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13355 1727096192.15704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096192.15842: stderr chunk (state=3): >>><<< 13355 1727096192.15846: stdout chunk (state=3): >>><<< 13355 1727096192.15852: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": 
{"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096192.18329: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096192.18340: _low_level_execute_command(): starting 13355 1727096192.18697: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096190.34771-15060-112432448839126/ > /dev/null 2>&1 && sleep 0' 13355 1727096192.20098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096192.20103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096192.20250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096192.20278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096192.20472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096192.22414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096192.22418: stdout chunk (state=3): >>><<< 13355 1727096192.22423: stderr chunk (state=3): >>><<< 13355 1727096192.22440: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096192.22447: handler run complete 13355 1727096192.23034: variable 'ansible_facts' from source: 
unknown 13355 1727096192.23489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096192.24727: variable 'ansible_facts' from source: unknown 13355 1727096192.25082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096192.25457: attempt loop complete, returning result 13355 1727096192.25465: _execute() done 13355 1727096192.25470: dumping result to json 13355 1727096192.25640: done dumping result, returning 13355 1727096192.25651: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-c514-593f-000000000854] 13355 1727096192.25656: sending task result for task 0afff68d-5257-c514-593f-000000000854 13355 1727096192.27443: done sending task result for task 0afff68d-5257-c514-593f-000000000854 13355 1727096192.27446: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096192.27576: no more pending results, returning what we have 13355 1727096192.27579: results queue empty 13355 1727096192.27580: checking for any_errors_fatal 13355 1727096192.27584: done checking for any_errors_fatal 13355 1727096192.27584: checking for max_fail_percentage 13355 1727096192.27586: done checking for max_fail_percentage 13355 1727096192.27587: checking to see if all hosts have failed and the running result is not ok 13355 1727096192.27588: done checking to see if all hosts have failed 13355 1727096192.27588: getting the remaining hosts for this loop 13355 1727096192.27589: done getting the remaining hosts for this loop 13355 1727096192.27592: getting the next task for host managed_node3 13355 1727096192.27682: done getting next task for host managed_node3 13355 1727096192.27688: ^ task is: TASK: fedora.linux_system_roles.network : Check which 
packages are installed 13355 1727096192.27693: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096192.27756: getting variables 13355 1727096192.27758: in VariableManager get_vars() 13355 1727096192.27807: Calling all_inventory to load vars for managed_node3 13355 1727096192.27810: Calling groups_inventory to load vars for managed_node3 13355 1727096192.27812: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096192.27828: Calling all_plugins_play to load vars for managed_node3 13355 1727096192.27830: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096192.27834: Calling groups_plugins_play to load vars for managed_node3 13355 1727096192.29480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096192.31290: done with get_vars() 13355 1727096192.31330: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:56:32 -0400 (0:00:02.028) 0:00:41.575 ****** 13355 
1727096192.31447: entering _queue_task() for managed_node3/package_facts 13355 1727096192.32085: worker is 1 (out of 1 available) 13355 1727096192.32096: exiting _queue_task() for managed_node3/package_facts 13355 1727096192.32107: done queuing things up, now waiting for results queue to drain 13355 1727096192.32109: waiting for pending results... 13355 1727096192.32236: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096192.32492: in run() - task 0afff68d-5257-c514-593f-000000000855 13355 1727096192.32501: variable 'ansible_search_path' from source: unknown 13355 1727096192.32505: variable 'ansible_search_path' from source: unknown 13355 1727096192.32586: calling self._execute() 13355 1727096192.32818: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096192.32825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096192.32832: variable 'omit' from source: magic vars 13355 1727096192.33357: variable 'ansible_distribution_major_version' from source: facts 13355 1727096192.33361: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096192.33364: variable 'omit' from source: magic vars 13355 1727096192.33482: variable 'omit' from source: magic vars 13355 1727096192.33485: variable 'omit' from source: magic vars 13355 1727096192.33488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096192.33532: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096192.33561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096192.33579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096192.33591: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096192.33704: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096192.33708: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096192.33710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096192.33735: Set connection var ansible_shell_executable to /bin/sh 13355 1727096192.33742: Set connection var ansible_shell_type to sh 13355 1727096192.33747: Set connection var ansible_pipelining to False 13355 1727096192.33753: Set connection var ansible_connection to ssh 13355 1727096192.33761: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096192.34075: Set connection var ansible_timeout to 10 13355 1727096192.34078: variable 'ansible_shell_executable' from source: unknown 13355 1727096192.34081: variable 'ansible_connection' from source: unknown 13355 1727096192.34084: variable 'ansible_module_compression' from source: unknown 13355 1727096192.34086: variable 'ansible_shell_type' from source: unknown 13355 1727096192.34088: variable 'ansible_shell_executable' from source: unknown 13355 1727096192.34090: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096192.34092: variable 'ansible_pipelining' from source: unknown 13355 1727096192.34094: variable 'ansible_timeout' from source: unknown 13355 1727096192.34096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096192.34108: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096192.34112: variable 'omit' from source: magic vars 13355 1727096192.34115: starting attempt loop 13355 1727096192.34117: running 
the handler 13355 1727096192.34119: _low_level_execute_command(): starting 13355 1727096192.34121: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096192.35461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096192.35795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096192.36049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096192.36128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096192.37777: stdout chunk (state=3): >>>/root <<< 13355 1727096192.37875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096192.37966: stderr chunk (state=3): >>><<< 13355 1727096192.37974: stdout chunk (state=3): >>><<< 13355 1727096192.37993: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096192.38093: _low_level_execute_command(): starting 13355 1727096192.38097: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397 `" && echo ansible-tmp-1727096192.379995-15139-140750238259397="` echo /root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397 `" ) && sleep 0' 13355 1727096192.38672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096192.38713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096192.38815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096192.38819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096192.38933: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096192.39111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096192.39147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096192.41097: stdout chunk (state=3): >>>ansible-tmp-1727096192.379995-15139-140750238259397=/root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397 <<< 13355 1727096192.41284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096192.41289: stdout chunk (state=3): >>><<< 13355 1727096192.41291: stderr chunk (state=3): >>><<< 13355 1727096192.41311: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096192.379995-15139-140750238259397=/root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096192.41434: variable 'ansible_module_compression' from source: unknown 13355 1727096192.41675: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13355 1727096192.41712: variable 'ansible_facts' from source: unknown 13355 1727096192.42121: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/AnsiballZ_package_facts.py 13355 1727096192.42454: Sending initial data 13355 1727096192.42457: Sent initial data (161 bytes) 13355 1727096192.43946: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096192.44063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096192.44084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096192.44192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096192.45891: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096192.45935: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096192.46066: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp012ak0ba /root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/AnsiballZ_package_facts.py <<< 13355 1727096192.46072: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/AnsiballZ_package_facts.py" <<< 13355 1727096192.46106: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp012ak0ba" to remote "/root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/AnsiballZ_package_facts.py" <<< 13355 1727096192.49000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096192.49174: stderr chunk (state=3): >>><<< 13355 1727096192.49178: stdout chunk (state=3): >>><<< 13355 1727096192.49195: done transferring module to remote 13355 1727096192.49212: _low_level_execute_command(): starting 13355 1727096192.49221: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/ /root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/AnsiballZ_package_facts.py && sleep 0' 13355 1727096192.50593: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096192.50733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096192.50753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096192.50801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096192.50935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096192.53206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096192.53308: stderr chunk (state=3): >>><<< 13355 1727096192.53322: stdout chunk (state=3): >>><<< 13355 1727096192.53400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096192.53409: _low_level_execute_command(): starting 13355 1727096192.53496: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/AnsiballZ_package_facts.py && sleep 0' 13355 1727096192.55003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096192.55007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096192.55010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096192.55076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096193.00919: stdout chunk (state=3): >>> 
{"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": 
"2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": 
[{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": 
"1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", 
"release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": 
"python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": 
"2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": 
"python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": 
"libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", 
"version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": 
"yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": 
"2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", 
"version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": 
"python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": 
"python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13355 1727096193.02829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096193.02843: stdout chunk (state=3): >>><<< 13355 1727096193.02863: stderr chunk (state=3): >>><<< 13355 1727096193.02908: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", 
"release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": 
[{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": 
"jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, 
"arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": 
[{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": 
[{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": 
[{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": 
[{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096193.05847: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096193.05888: _low_level_execute_command(): starting 13355 1727096193.05900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096192.379995-15139-140750238259397/ > /dev/null 2>&1 && sleep 0' 13355 1727096193.06538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096193.06581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096193.06598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096193.06614: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096193.06700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096193.06721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096193.06798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096193.08763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096193.08781: stdout chunk (state=3): >>><<< 13355 1727096193.08805: stderr chunk (state=3): >>><<< 13355 1727096193.08826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096193.08838: handler run complete 13355 1727096193.09774: variable 'ansible_facts' from source: unknown 13355 1727096193.10272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.12438: variable 'ansible_facts' from source: unknown 13355 1727096193.12932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.13776: attempt loop complete, returning result 13355 1727096193.13780: _execute() done 13355 1727096193.13783: dumping result to json 13355 1727096193.13983: done dumping result, returning 13355 1727096193.14175: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-c514-593f-000000000855] 13355 1727096193.14179: sending task result for task 0afff68d-5257-c514-593f-000000000855 13355 1727096193.16677: done sending task result for task 0afff68d-5257-c514-593f-000000000855 13355 1727096193.16680: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096193.16833: no more pending results, returning what we have 13355 1727096193.16836: results queue empty 13355 1727096193.16836: checking for any_errors_fatal 13355 1727096193.16842: done checking for any_errors_fatal 13355 1727096193.16842: checking for max_fail_percentage 13355 1727096193.16844: done checking for max_fail_percentage 13355 1727096193.16844: checking to see if all hosts have failed and the running result is not ok 13355 1727096193.16845: done checking to see if all hosts have failed 13355 1727096193.16846: getting the remaining hosts for this loop 13355 1727096193.16847: done getting the remaining hosts for this loop 13355 
1727096193.16850: getting the next task for host managed_node3 13355 1727096193.16859: done getting next task for host managed_node3 13355 1727096193.16863: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096193.16865: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096193.16885: getting variables 13355 1727096193.16887: in VariableManager get_vars() 13355 1727096193.16929: Calling all_inventory to load vars for managed_node3 13355 1727096193.16932: Calling groups_inventory to load vars for managed_node3 13355 1727096193.16934: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096193.16942: Calling all_plugins_play to load vars for managed_node3 13355 1727096193.16945: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096193.16948: Calling groups_plugins_play to load vars for managed_node3 13355 1727096193.18220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.20003: done with get_vars() 13355 1727096193.20026: done getting variables 13355 1727096193.20093: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:56:33 -0400 (0:00:00.886) 0:00:42.461 ****** 13355 1727096193.20131: entering _queue_task() for managed_node3/debug 13355 1727096193.20696: worker is 1 (out of 1 available) 13355 1727096193.20707: exiting _queue_task() for managed_node3/debug 13355 1727096193.20717: done queuing things up, now waiting for results queue to drain 13355 1727096193.20719: waiting for pending results... 13355 1727096193.20963: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096193.21005: in run() - task 0afff68d-5257-c514-593f-00000000011c 13355 1727096193.21028: variable 'ansible_search_path' from source: unknown 13355 1727096193.21039: variable 'ansible_search_path' from source: unknown 13355 1727096193.21095: calling self._execute() 13355 1727096193.21208: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.21219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.21232: variable 'omit' from source: magic vars 13355 1727096193.21649: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.21670: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096193.21681: variable 'omit' from source: magic vars 13355 1727096193.21748: variable 'omit' from source: magic vars 13355 1727096193.21861: variable 'network_provider' from source: set_fact 13355 1727096193.21888: variable 'omit' from source: magic vars 13355 1727096193.21946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096193.22041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 
1727096193.22044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096193.22046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096193.22048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096193.22085: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096193.22095: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.22102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.22216: Set connection var ansible_shell_executable to /bin/sh 13355 1727096193.22227: Set connection var ansible_shell_type to sh 13355 1727096193.22239: Set connection var ansible_pipelining to False 13355 1727096193.22370: Set connection var ansible_connection to ssh 13355 1727096193.22373: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096193.22376: Set connection var ansible_timeout to 10 13355 1727096193.22378: variable 'ansible_shell_executable' from source: unknown 13355 1727096193.22380: variable 'ansible_connection' from source: unknown 13355 1727096193.22382: variable 'ansible_module_compression' from source: unknown 13355 1727096193.22384: variable 'ansible_shell_type' from source: unknown 13355 1727096193.22386: variable 'ansible_shell_executable' from source: unknown 13355 1727096193.22388: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.22390: variable 'ansible_pipelining' from source: unknown 13355 1727096193.22392: variable 'ansible_timeout' from source: unknown 13355 1727096193.22394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.22511: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096193.22528: variable 'omit' from source: magic vars 13355 1727096193.22538: starting attempt loop 13355 1727096193.22586: running the handler 13355 1727096193.22610: handler run complete 13355 1727096193.22635: attempt loop complete, returning result 13355 1727096193.22641: _execute() done 13355 1727096193.22647: dumping result to json 13355 1727096193.22653: done dumping result, returning 13355 1727096193.22666: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-c514-593f-00000000011c] 13355 1727096193.22677: sending task result for task 0afff68d-5257-c514-593f-00000000011c ok: [managed_node3] => {} MSG: Using network provider: nm 13355 1727096193.22959: no more pending results, returning what we have 13355 1727096193.22963: results queue empty 13355 1727096193.22964: checking for any_errors_fatal 13355 1727096193.22978: done checking for any_errors_fatal 13355 1727096193.22979: checking for max_fail_percentage 13355 1727096193.22982: done checking for max_fail_percentage 13355 1727096193.22982: checking to see if all hosts have failed and the running result is not ok 13355 1727096193.22983: done checking to see if all hosts have failed 13355 1727096193.22984: getting the remaining hosts for this loop 13355 1727096193.22985: done getting the remaining hosts for this loop 13355 1727096193.22989: getting the next task for host managed_node3 13355 1727096193.22997: done getting next task for host managed_node3 13355 1727096193.23001: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13355 
1727096193.23005: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096193.23019: getting variables 13355 1727096193.23021: in VariableManager get_vars() 13355 1727096193.23203: Calling all_inventory to load vars for managed_node3 13355 1727096193.23206: Calling groups_inventory to load vars for managed_node3 13355 1727096193.23209: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096193.23220: Calling all_plugins_play to load vars for managed_node3 13355 1727096193.23224: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096193.23227: Calling groups_plugins_play to load vars for managed_node3 13355 1727096193.23987: done sending task result for task 0afff68d-5257-c514-593f-00000000011c 13355 1727096193.23991: WORKER PROCESS EXITING 13355 1727096193.24881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.26534: done with get_vars() 13355 1727096193.26569: done getting variables 13355 1727096193.26638: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:56:33 -0400 (0:00:00.065) 0:00:42.527 ****** 13355 1727096193.26680: entering _queue_task() for managed_node3/fail 13355 1727096193.27291: worker is 1 (out of 1 available) 13355 1727096193.27301: exiting _queue_task() for managed_node3/fail 13355 1727096193.27313: done queuing things up, now waiting for results queue to drain 13355 1727096193.27314: waiting for pending results... 13355 1727096193.27419: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13355 1727096193.27572: in run() - task 0afff68d-5257-c514-593f-00000000011d 13355 1727096193.27593: variable 'ansible_search_path' from source: unknown 13355 1727096193.27600: variable 'ansible_search_path' from source: unknown 13355 1727096193.27640: calling self._execute() 13355 1727096193.27761: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.27779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.27795: variable 'omit' from source: magic vars 13355 1727096193.28206: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.28225: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096193.28372: variable 'network_state' from source: role '' defaults 13355 1727096193.28390: Evaluated conditional (network_state != {}): False 13355 1727096193.28398: when evaluation is False, skipping this task 13355 1727096193.28416: _execute() done 13355 1727096193.28424: dumping result to json 13355 1727096193.28524: done dumping result, returning 13355 1727096193.28528: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying 
the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-c514-593f-00000000011d] 13355 1727096193.28531: sending task result for task 0afff68d-5257-c514-593f-00000000011d 13355 1727096193.28609: done sending task result for task 0afff68d-5257-c514-593f-00000000011d 13355 1727096193.28613: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096193.28686: no more pending results, returning what we have 13355 1727096193.28691: results queue empty 13355 1727096193.28692: checking for any_errors_fatal 13355 1727096193.28701: done checking for any_errors_fatal 13355 1727096193.28702: checking for max_fail_percentage 13355 1727096193.28704: done checking for max_fail_percentage 13355 1727096193.28705: checking to see if all hosts have failed and the running result is not ok 13355 1727096193.28706: done checking to see if all hosts have failed 13355 1727096193.28706: getting the remaining hosts for this loop 13355 1727096193.28708: done getting the remaining hosts for this loop 13355 1727096193.28712: getting the next task for host managed_node3 13355 1727096193.28719: done getting next task for host managed_node3 13355 1727096193.28724: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096193.28727: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096193.28755: getting variables 13355 1727096193.28760: in VariableManager get_vars() 13355 1727096193.28821: Calling all_inventory to load vars for managed_node3 13355 1727096193.28824: Calling groups_inventory to load vars for managed_node3 13355 1727096193.28826: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096193.28839: Calling all_plugins_play to load vars for managed_node3 13355 1727096193.28843: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096193.28846: Calling groups_plugins_play to load vars for managed_node3 13355 1727096193.30755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.38702: done with get_vars() 13355 1727096193.38732: done getting variables 13355 1727096193.38820: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:56:33 -0400 (0:00:00.121) 0:00:42.649 ****** 13355 1727096193.38854: entering _queue_task() for managed_node3/fail 13355 1727096193.39362: worker is 1 (out of 1 available) 13355 1727096193.39374: exiting _queue_task() for managed_node3/fail 13355 1727096193.39384: done queuing things up, now waiting for results queue to drain 13355 1727096193.39385: waiting for pending results... 
13355 1727096193.39629: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096193.39779: in run() - task 0afff68d-5257-c514-593f-00000000011e 13355 1727096193.39830: variable 'ansible_search_path' from source: unknown 13355 1727096193.39837: variable 'ansible_search_path' from source: unknown 13355 1727096193.39861: calling self._execute() 13355 1727096193.39981: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.40047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.40051: variable 'omit' from source: magic vars 13355 1727096193.40483: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.40542: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096193.40665: variable 'network_state' from source: role '' defaults 13355 1727096193.40685: Evaluated conditional (network_state != {}): False 13355 1727096193.40694: when evaluation is False, skipping this task 13355 1727096193.40701: _execute() done 13355 1727096193.40763: dumping result to json 13355 1727096193.40768: done dumping result, returning 13355 1727096193.40772: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-c514-593f-00000000011e] 13355 1727096193.40775: sending task result for task 0afff68d-5257-c514-593f-00000000011e skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096193.40921: no more pending results, returning what we have 13355 1727096193.40926: results queue empty 13355 1727096193.40927: checking for any_errors_fatal 13355 1727096193.40936: done checking for any_errors_fatal 
13355 1727096193.40937: checking for max_fail_percentage 13355 1727096193.40939: done checking for max_fail_percentage 13355 1727096193.40940: checking to see if all hosts have failed and the running result is not ok 13355 1727096193.40941: done checking to see if all hosts have failed 13355 1727096193.40942: getting the remaining hosts for this loop 13355 1727096193.40944: done getting the remaining hosts for this loop 13355 1727096193.40948: getting the next task for host managed_node3 13355 1727096193.40958: done getting next task for host managed_node3 13355 1727096193.40963: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096193.40966: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096193.41205: getting variables 13355 1727096193.41207: in VariableManager get_vars() 13355 1727096193.41266: Calling all_inventory to load vars for managed_node3 13355 1727096193.41274: Calling groups_inventory to load vars for managed_node3 13355 1727096193.41277: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096193.41295: done sending task result for task 0afff68d-5257-c514-593f-00000000011e 13355 1727096193.41298: WORKER PROCESS EXITING 13355 1727096193.41382: Calling all_plugins_play to load vars for managed_node3 13355 1727096193.41387: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096193.41390: Calling groups_plugins_play to load vars for managed_node3 13355 1727096193.42904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.46009: done with get_vars() 13355 1727096193.46040: done getting variables 13355 1727096193.46199: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:56:33 -0400 (0:00:00.073) 0:00:42.722 ****** 13355 1727096193.46233: entering _queue_task() for managed_node3/fail 13355 1727096193.47083: worker is 1 (out of 1 available) 13355 1727096193.47097: exiting _queue_task() for managed_node3/fail 13355 1727096193.47111: done queuing things up, now waiting for results queue to drain 13355 1727096193.47112: waiting for pending results... 
13355 1727096193.47502: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096193.47738: in run() - task 0afff68d-5257-c514-593f-00000000011f 13355 1727096193.47751: variable 'ansible_search_path' from source: unknown 13355 1727096193.47841: variable 'ansible_search_path' from source: unknown 13355 1727096193.47893: calling self._execute() 13355 1727096193.48109: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.48113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.48123: variable 'omit' from source: magic vars 13355 1727096193.48915: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.48974: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096193.49390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096193.53979: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096193.54411: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096193.54426: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096193.54463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096193.54491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096193.54564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.54598: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096193.54624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.54714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.54717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.54762: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.54780: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13355 1727096193.54890: variable 'ansible_distribution' from source: facts 13355 1727096193.54894: variable '__network_rh_distros' from source: role '' defaults 13355 1727096193.54953: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13355 1727096193.55135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.55159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096193.55370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 
1727096193.55374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.55377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.55379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.55382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096193.55385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.55388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.55390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.55392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.55406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13355 1727096193.55427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.55507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.55511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.55943: variable 'network_connections' from source: task vars 13355 1727096193.55952: variable 'port1_profile' from source: play vars 13355 1727096193.56054: variable 'port1_profile' from source: play vars 13355 1727096193.56060: variable 'port2_profile' from source: play vars 13355 1727096193.56311: variable 'port2_profile' from source: play vars 13355 1727096193.56319: variable 'network_state' from source: role '' defaults 13355 1727096193.56394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096193.56801: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096193.56841: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096193.57074: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096193.57125: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096193.57172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 
1727096193.57220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096193.57223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.57243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096193.57454: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13355 1727096193.57460: when evaluation is False, skipping this task 13355 1727096193.57463: _execute() done 13355 1727096193.57471: dumping result to json 13355 1727096193.57474: done dumping result, returning 13355 1727096193.57477: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-c514-593f-00000000011f] 13355 1727096193.57483: sending task result for task 0afff68d-5257-c514-593f-00000000011f skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13355 1727096193.57708: no more pending results, returning what we have 13355 1727096193.57712: results queue empty 13355 
1727096193.57713: checking for any_errors_fatal 13355 1727096193.57718: done checking for any_errors_fatal 13355 1727096193.57719: checking for max_fail_percentage 13355 1727096193.57720: done checking for max_fail_percentage 13355 1727096193.57721: checking to see if all hosts have failed and the running result is not ok 13355 1727096193.57722: done checking to see if all hosts have failed 13355 1727096193.57723: getting the remaining hosts for this loop 13355 1727096193.57724: done getting the remaining hosts for this loop 13355 1727096193.57727: getting the next task for host managed_node3 13355 1727096193.57735: done getting next task for host managed_node3 13355 1727096193.57740: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13355 1727096193.57743: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096193.57771: getting variables 13355 1727096193.57773: in VariableManager get_vars() 13355 1727096193.57828: Calling all_inventory to load vars for managed_node3 13355 1727096193.57833: Calling groups_inventory to load vars for managed_node3 13355 1727096193.57835: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096193.57846: Calling all_plugins_play to load vars for managed_node3 13355 1727096193.57850: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096193.57853: Calling groups_plugins_play to load vars for managed_node3 13355 1727096193.58425: done sending task result for task 0afff68d-5257-c514-593f-00000000011f 13355 1727096193.58429: WORKER PROCESS EXITING 13355 1727096193.61573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.64779: done with get_vars() 13355 1727096193.64810: done getting variables 13355 1727096193.64869: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:56:33 -0400 (0:00:00.186) 0:00:42.909 ****** 13355 1727096193.64900: entering _queue_task() for managed_node3/dnf 13355 1727096193.65250: worker is 1 (out of 1 available) 13355 1727096193.65264: exiting _queue_task() for managed_node3/dnf 13355 1727096193.65478: done queuing things up, now waiting for results queue to drain 13355 1727096193.65480: waiting for pending results... 
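The teaming-abort conditional evaluated just above (`network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or ...`) can be approximated in plain Python. This is a hedged sketch, not the role's actual implementation: the profile values are placeholders standing in for the play vars (`port1_profile`, `port2_profile`) the log resolves, and `network_state = {}` mirrors the role default the log reports; Jinja2's `selectattr("type", "defined")` corresponds to keeping only items that have a `type` key.

```python
import re

# Hypothetical stand-ins for the resolved play vars; the log shows the real
# profiles are not of type "team", so the conditional comes out False.
network_connections = [
    {"name": "bond0.0", "type": "ethernet"},  # assumed non-team profile
    {"name": "bond0.1", "type": "ethernet"},  # assumed non-team profile
]
network_state = {}  # role default, per the earlier `network_state != {}` skip

def team_profiles(items):
    # Equivalent of: selectattr("type", "defined")
    #              | selectattr("type", "match", "^team$") | list
    return [i for i in items if "type" in i and re.match(r"^team$", i["type"])]

condition = (
    len(team_profiles(network_connections)) > 0
    or len(team_profiles(network_state.get("interfaces", []))) > 0
)
print(condition)  # False -> the EL10 teaming-abort task is skipped
```

When the condition is False, the task executor short-circuits in `_execute()` and reports `skip_reason: "Conditional result was False"`, exactly as the JSON skip result above shows.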
13355 1727096193.65654: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13355 1727096193.65775: in run() - task 0afff68d-5257-c514-593f-000000000120 13355 1727096193.65778: variable 'ansible_search_path' from source: unknown 13355 1727096193.65781: variable 'ansible_search_path' from source: unknown 13355 1727096193.66008: calling self._execute() 13355 1727096193.66475: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.66479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.66483: variable 'omit' from source: magic vars 13355 1727096193.67174: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.67178: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096193.67217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096193.71226: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096193.71420: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096193.71589: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096193.71623: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096193.71648: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096193.71834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.71862: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096193.71887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.71942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.71955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.72143: variable 'ansible_distribution' from source: facts 13355 1727096193.72146: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.72149: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13355 1727096193.72209: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096193.72335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.72360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096193.72381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.72418: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.72430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.72475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.72492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096193.72512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.72551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.72564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.72599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.72619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 
1727096193.72640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.72682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.72693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.72835: variable 'network_connections' from source: task vars 13355 1727096193.72847: variable 'port1_profile' from source: play vars 13355 1727096193.72973: variable 'port1_profile' from source: play vars 13355 1727096193.72976: variable 'port2_profile' from source: play vars 13355 1727096193.73182: variable 'port2_profile' from source: play vars 13355 1727096193.73252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096193.73761: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096193.73852: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096193.73855: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096193.74220: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096193.74317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096193.74321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096193.74332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.74426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096193.74612: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096193.75124: variable 'network_connections' from source: task vars 13355 1727096193.75127: variable 'port1_profile' from source: play vars 13355 1727096193.75234: variable 'port1_profile' from source: play vars 13355 1727096193.75245: variable 'port2_profile' from source: play vars 13355 1727096193.75424: variable 'port2_profile' from source: play vars 13355 1727096193.75464: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096193.75469: when evaluation is False, skipping this task 13355 1727096193.75472: _execute() done 13355 1727096193.75476: dumping result to json 13355 1727096193.75478: done dumping result, returning 13355 1727096193.75481: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000120] 13355 1727096193.75605: sending task result for task 0afff68d-5257-c514-593f-000000000120 13355 1727096193.75713: done sending task result for task 0afff68d-5257-c514-593f-000000000120 13355 1727096193.75716: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or 
__network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096193.75818: no more pending results, returning what we have 13355 1727096193.75823: results queue empty 13355 1727096193.75823: checking for any_errors_fatal 13355 1727096193.75831: done checking for any_errors_fatal 13355 1727096193.75832: checking for max_fail_percentage 13355 1727096193.75835: done checking for max_fail_percentage 13355 1727096193.75835: checking to see if all hosts have failed and the running result is not ok 13355 1727096193.75836: done checking to see if all hosts have failed 13355 1727096193.75836: getting the remaining hosts for this loop 13355 1727096193.75838: done getting the remaining hosts for this loop 13355 1727096193.75841: getting the next task for host managed_node3 13355 1727096193.75848: done getting next task for host managed_node3 13355 1727096193.75851: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13355 1727096193.75854: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096193.75877: getting variables 13355 1727096193.75878: in VariableManager get_vars() 13355 1727096193.75935: Calling all_inventory to load vars for managed_node3 13355 1727096193.75937: Calling groups_inventory to load vars for managed_node3 13355 1727096193.75940: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096193.75950: Calling all_plugins_play to load vars for managed_node3 13355 1727096193.75954: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096193.75957: Calling groups_plugins_play to load vars for managed_node3 13355 1727096193.78625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.81110: done with get_vars() 13355 1727096193.81146: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13355 1727096193.81236: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:56:33 -0400 (0:00:00.163) 0:00:43.073 ****** 13355 1727096193.81279: entering _queue_task() for managed_node3/yum 13355 1727096193.81800: worker is 1 (out of 1 available) 13355 1727096193.81817: exiting _queue_task() for managed_node3/yum 13355 1727096193.81831: done queuing things up, now waiting for results queue to drain 13355 1727096193.81833: waiting for pending results... 
13355 1727096193.82383: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13355 1727096193.82390: in run() - task 0afff68d-5257-c514-593f-000000000121 13355 1727096193.82393: variable 'ansible_search_path' from source: unknown 13355 1727096193.82396: variable 'ansible_search_path' from source: unknown 13355 1727096193.82405: calling self._execute() 13355 1727096193.82530: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.82541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.82560: variable 'omit' from source: magic vars 13355 1727096193.83001: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.83026: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096193.83241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096193.89666: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096193.90124: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096193.90342: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096193.90364: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096193.90399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096193.90538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096193.90702: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096193.90733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096193.90892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096193.90914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096193.91117: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.91138: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13355 1727096193.91273: when evaluation is False, skipping this task 13355 1727096193.91277: _execute() done 13355 1727096193.91279: dumping result to json 13355 1727096193.91282: done dumping result, returning 13355 1727096193.91284: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000121] 13355 1727096193.91287: sending task result for task 0afff68d-5257-c514-593f-000000000121 13355 1727096193.91474: done sending task result for task 0afff68d-5257-c514-593f-000000000121 13355 1727096193.91478: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13355 1727096193.91536: no more pending results, returning 
what we have 13355 1727096193.91541: results queue empty 13355 1727096193.91542: checking for any_errors_fatal 13355 1727096193.91548: done checking for any_errors_fatal 13355 1727096193.91549: checking for max_fail_percentage 13355 1727096193.91551: done checking for max_fail_percentage 13355 1727096193.91552: checking to see if all hosts have failed and the running result is not ok 13355 1727096193.91552: done checking to see if all hosts have failed 13355 1727096193.91553: getting the remaining hosts for this loop 13355 1727096193.91555: done getting the remaining hosts for this loop 13355 1727096193.91561: getting the next task for host managed_node3 13355 1727096193.91571: done getting next task for host managed_node3 13355 1727096193.91576: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096193.91579: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096193.91601: getting variables 13355 1727096193.91603: in VariableManager get_vars() 13355 1727096193.91663: Calling all_inventory to load vars for managed_node3 13355 1727096193.91666: Calling groups_inventory to load vars for managed_node3 13355 1727096193.91884: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096193.91896: Calling all_plugins_play to load vars for managed_node3 13355 1727096193.91899: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096193.91902: Calling groups_plugins_play to load vars for managed_node3 13355 1727096193.95650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096193.98066: done with get_vars() 13355 1727096193.98098: done getting variables 13355 1727096193.98171: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:56:33 -0400 (0:00:00.169) 0:00:43.242 ****** 13355 1727096193.98208: entering _queue_task() for managed_node3/fail 13355 1727096193.98692: worker is 1 (out of 1 available) 13355 1727096193.98708: exiting _queue_task() for managed_node3/fail 13355 1727096193.98721: done queuing things up, now waiting for results queue to drain 13355 1727096193.98722: waiting for pending results... 
13355 1727096193.98987: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096193.99110: in run() - task 0afff68d-5257-c514-593f-000000000122 13355 1727096193.99135: variable 'ansible_search_path' from source: unknown 13355 1727096193.99172: variable 'ansible_search_path' from source: unknown 13355 1727096193.99186: calling self._execute() 13355 1727096193.99298: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096193.99311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096193.99327: variable 'omit' from source: magic vars 13355 1727096193.99733: variable 'ansible_distribution_major_version' from source: facts 13355 1727096193.99792: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096193.99888: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096194.00099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096194.04175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096194.04220: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096194.04459: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096194.04463: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096194.04466: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096194.04648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096194.04733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.04874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.04925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.04941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.04995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.05017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.05158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.05274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.05355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.05362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.05476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.05502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.05545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.05558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.06114: variable 'network_connections' from source: task vars 13355 1727096194.06124: variable 'port1_profile' from source: play vars 13355 1727096194.06204: variable 'port1_profile' from source: play vars 13355 1727096194.06219: variable 'port2_profile' from source: play vars 13355 1727096194.06410: variable 'port2_profile' from source: play vars 13355 1727096194.06550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096194.06987: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096194.07026: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 
1727096194.07178: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096194.07210: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096194.07258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096194.07401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096194.07426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.07452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096194.07573: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096194.08195: variable 'network_connections' from source: task vars 13355 1727096194.08199: variable 'port1_profile' from source: play vars 13355 1727096194.08402: variable 'port1_profile' from source: play vars 13355 1727096194.08405: variable 'port2_profile' from source: play vars 13355 1727096194.08459: variable 'port2_profile' from source: play vars 13355 1727096194.08599: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096194.08609: when evaluation is False, skipping this task 13355 1727096194.08612: _execute() done 13355 1727096194.08614: dumping result to json 13355 1727096194.08616: done dumping result, returning 13355 1727096194.08619: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000122] 13355 1727096194.08633: sending task result for task 0afff68d-5257-c514-593f-000000000122 13355 1727096194.08788: done sending task result for task 0afff68d-5257-c514-593f-000000000122 13355 1727096194.08792: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096194.08847: no more pending results, returning what we have 13355 1727096194.08851: results queue empty 13355 1727096194.08852: checking for any_errors_fatal 13355 1727096194.08861: done checking for any_errors_fatal 13355 1727096194.08862: checking for max_fail_percentage 13355 1727096194.08864: done checking for max_fail_percentage 13355 1727096194.08865: checking to see if all hosts have failed and the running result is not ok 13355 1727096194.08866: done checking to see if all hosts have failed 13355 1727096194.08867: getting the remaining hosts for this loop 13355 1727096194.08870: done getting the remaining hosts for this loop 13355 1727096194.08873: getting the next task for host managed_node3 13355 1727096194.08881: done getting next task for host managed_node3 13355 1727096194.08885: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13355 1727096194.08888: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096194.08911: getting variables 13355 1727096194.08913: in VariableManager get_vars() 13355 1727096194.09084: Calling all_inventory to load vars for managed_node3 13355 1727096194.09087: Calling groups_inventory to load vars for managed_node3 13355 1727096194.09091: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096194.09102: Calling all_plugins_play to load vars for managed_node3 13355 1727096194.09106: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096194.09110: Calling groups_plugins_play to load vars for managed_node3 13355 1727096194.11197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096194.14312: done with get_vars() 13355 1727096194.14340: done getting variables 13355 1727096194.14450: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:56:34 -0400 (0:00:00.162) 0:00:43.405 ****** 13355 1727096194.14489: entering _queue_task() for managed_node3/package 13355 1727096194.15424: worker is 1 (out of 1 available) 13355 1727096194.15440: exiting _queue_task() for managed_node3/package 13355 1727096194.15453: done queuing things up, now waiting for results queue to drain 13355 1727096194.15454: waiting for pending results... 
13355 1727096194.16081: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 13355 1727096194.16154: in run() - task 0afff68d-5257-c514-593f-000000000123 13355 1727096194.16376: variable 'ansible_search_path' from source: unknown 13355 1727096194.16380: variable 'ansible_search_path' from source: unknown 13355 1727096194.16386: calling self._execute() 13355 1727096194.16573: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096194.16577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096194.16580: variable 'omit' from source: magic vars 13355 1727096194.17449: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.17464: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096194.17870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096194.18573: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096194.18656: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096194.18888: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096194.19113: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096194.19250: variable 'network_packages' from source: role '' defaults 13355 1727096194.19386: variable '__network_provider_setup' from source: role '' defaults 13355 1727096194.19404: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096194.19485: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096194.19501: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096194.19578: variable 
'__network_packages_default_nm' from source: role '' defaults 13355 1727096194.19788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096194.23642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096194.23725: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096194.23856: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096194.23860: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096194.23862: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096194.23947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.24188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.24191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.24209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.24231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 
1727096194.24283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.24316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.24344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.24392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.24421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.24654: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096194.24978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.25389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.25393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.25396: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.25398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.25935: variable 'ansible_python' from source: facts 13355 1727096194.25939: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096194.26051: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096194.26374: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096194.26437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.26507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.26607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.26719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.26739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.27076: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.27088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.27091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.27100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.27120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.27359: variable 'network_connections' from source: task vars 13355 1727096194.27409: variable 'port1_profile' from source: play vars 13355 1727096194.27623: variable 'port1_profile' from source: play vars 13355 1727096194.27643: variable 'port2_profile' from source: play vars 13355 1727096194.27886: variable 'port2_profile' from source: play vars 13355 1727096194.28274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096194.28304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096194.28773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.28778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096194.28781: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096194.29680: variable 'network_connections' from source: task vars 13355 1727096194.29694: variable 'port1_profile' from source: play vars 13355 1727096194.29978: variable 'port1_profile' from source: play vars 13355 1727096194.30096: variable 'port2_profile' from source: play vars 13355 1727096194.30412: variable 'port2_profile' from source: play vars 13355 1727096194.30459: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096194.30612: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096194.31412: variable 'network_connections' from source: task vars 13355 1727096194.31425: variable 'port1_profile' from source: play vars 13355 1727096194.31674: variable 'port1_profile' from source: play vars 13355 1727096194.31677: variable 'port2_profile' from source: play vars 13355 1727096194.31725: variable 'port2_profile' from source: play vars 13355 1727096194.31873: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096194.31914: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096194.32640: variable 'network_connections' from source: task vars 13355 1727096194.32698: variable 'port1_profile' from source: play vars 13355 1727096194.33038: variable 'port1_profile' from source: play vars 13355 1727096194.33042: variable 'port2_profile' from source: play vars 13355 1727096194.33044: variable 'port2_profile' from source: play vars 13355 1727096194.33256: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096194.33259: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096194.33262: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096194.33410: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096194.33975: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096194.34892: variable 'network_connections' from source: task vars 13355 1727096194.34906: variable 'port1_profile' from source: play vars 13355 1727096194.35087: variable 'port1_profile' from source: play vars 13355 1727096194.35102: variable 'port2_profile' from source: play vars 13355 1727096194.35156: variable 'port2_profile' from source: play vars 13355 1727096194.35286: variable 'ansible_distribution' from source: facts 13355 1727096194.35294: variable '__network_rh_distros' from source: role '' defaults 13355 1727096194.35303: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.35319: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096194.35479: variable 'ansible_distribution' from source: facts 13355 1727096194.35608: variable '__network_rh_distros' from source: role '' defaults 13355 1727096194.35618: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.35634: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13355 1727096194.36048: variable 'ansible_distribution' from source: facts 13355 1727096194.36057: variable '__network_rh_distros' from source: role '' defaults 13355 1727096194.36070: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.36124: variable 'network_provider' from source: set_fact 13355 1727096194.36165: variable 'ansible_facts' from source: unknown 
13355 1727096194.37863: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13355 1727096194.37877: when evaluation is False, skipping this task 13355 1727096194.37885: _execute() done 13355 1727096194.37893: dumping result to json 13355 1727096194.37900: done dumping result, returning 13355 1727096194.37917: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-c514-593f-000000000123] 13355 1727096194.37933: sending task result for task 0afff68d-5257-c514-593f-000000000123 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13355 1727096194.38158: no more pending results, returning what we have 13355 1727096194.38163: results queue empty 13355 1727096194.38164: checking for any_errors_fatal 13355 1727096194.38174: done checking for any_errors_fatal 13355 1727096194.38175: checking for max_fail_percentage 13355 1727096194.38176: done checking for max_fail_percentage 13355 1727096194.38177: checking to see if all hosts have failed and the running result is not ok 13355 1727096194.38177: done checking to see if all hosts have failed 13355 1727096194.38178: getting the remaining hosts for this loop 13355 1727096194.38179: done getting the remaining hosts for this loop 13355 1727096194.38182: getting the next task for host managed_node3 13355 1727096194.38190: done getting next task for host managed_node3 13355 1727096194.38193: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13355 1727096194.38196: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096194.38216: getting variables 13355 1727096194.38222: in VariableManager get_vars() 13355 1727096194.38386: Calling all_inventory to load vars for managed_node3 13355 1727096194.38389: Calling groups_inventory to load vars for managed_node3 13355 1727096194.38391: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096194.38402: Calling all_plugins_play to load vars for managed_node3 13355 1727096194.38405: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096194.38409: Calling groups_plugins_play to load vars for managed_node3 13355 1727096194.39571: done sending task result for task 0afff68d-5257-c514-593f-000000000123 13355 1727096194.39576: WORKER PROCESS EXITING 13355 1727096194.41163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096194.44527: done with get_vars() 13355 1727096194.44674: done getting variables 13355 1727096194.44730: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:56:34 -0400 (0:00:00.302) 0:00:43.708 ****** 13355 1727096194.44763: 
entering _queue_task() for managed_node3/package 13355 1727096194.45533: worker is 1 (out of 1 available) 13355 1727096194.45663: exiting _queue_task() for managed_node3/package 13355 1727096194.45678: done queuing things up, now waiting for results queue to drain 13355 1727096194.45680: waiting for pending results... 13355 1727096194.46029: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13355 1727096194.46743: in run() - task 0afff68d-5257-c514-593f-000000000124 13355 1727096194.46748: variable 'ansible_search_path' from source: unknown 13355 1727096194.46751: variable 'ansible_search_path' from source: unknown 13355 1727096194.46779: calling self._execute() 13355 1727096194.46886: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096194.46891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096194.47152: variable 'omit' from source: magic vars 13355 1727096194.47553: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.47651: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096194.47706: variable 'network_state' from source: role '' defaults 13355 1727096194.47722: Evaluated conditional (network_state != {}): False 13355 1727096194.47730: when evaluation is False, skipping this task 13355 1727096194.47739: _execute() done 13355 1727096194.47747: dumping result to json 13355 1727096194.47763: done dumping result, returning 13355 1727096194.47779: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-c514-593f-000000000124] 13355 1727096194.47789: sending task result for task 0afff68d-5257-c514-593f-000000000124 13355 1727096194.48017: done sending task result for task 0afff68d-5257-c514-593f-000000000124 13355 
1727096194.48020: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096194.48078: no more pending results, returning what we have 13355 1727096194.48083: results queue empty 13355 1727096194.48083: checking for any_errors_fatal 13355 1727096194.48092: done checking for any_errors_fatal 13355 1727096194.48093: checking for max_fail_percentage 13355 1727096194.48095: done checking for max_fail_percentage 13355 1727096194.48096: checking to see if all hosts have failed and the running result is not ok 13355 1727096194.48097: done checking to see if all hosts have failed 13355 1727096194.48097: getting the remaining hosts for this loop 13355 1727096194.48099: done getting the remaining hosts for this loop 13355 1727096194.48103: getting the next task for host managed_node3 13355 1727096194.48111: done getting next task for host managed_node3 13355 1727096194.48115: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13355 1727096194.48118: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096194.48381: getting variables 13355 1727096194.48383: in VariableManager get_vars() 13355 1727096194.48433: Calling all_inventory to load vars for managed_node3 13355 1727096194.48436: Calling groups_inventory to load vars for managed_node3 13355 1727096194.48438: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096194.48447: Calling all_plugins_play to load vars for managed_node3 13355 1727096194.48450: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096194.48453: Calling groups_plugins_play to load vars for managed_node3 13355 1727096194.50034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096194.51611: done with get_vars() 13355 1727096194.51642: done getting variables 13355 1727096194.51698: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:56:34 -0400 (0:00:00.069) 0:00:43.777 ****** 13355 1727096194.51735: entering _queue_task() for managed_node3/package 13355 1727096194.52201: worker is 1 (out of 1 available) 13355 1727096194.52213: exiting _queue_task() for managed_node3/package 13355 1727096194.52224: done queuing things up, now waiting for results queue to drain 13355 1727096194.52225: waiting for pending results... 
13355 1727096194.52464: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13355 1727096194.52603: in run() - task 0afff68d-5257-c514-593f-000000000125 13355 1727096194.52613: variable 'ansible_search_path' from source: unknown 13355 1727096194.52670: variable 'ansible_search_path' from source: unknown 13355 1727096194.52674: calling self._execute() 13355 1727096194.52789: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096194.52801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096194.52823: variable 'omit' from source: magic vars 13355 1727096194.53230: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.53245: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096194.53379: variable 'network_state' from source: role '' defaults 13355 1727096194.53398: Evaluated conditional (network_state != {}): False 13355 1727096194.53426: when evaluation is False, skipping this task 13355 1727096194.53429: _execute() done 13355 1727096194.53433: dumping result to json 13355 1727096194.53435: done dumping result, returning 13355 1727096194.53442: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-c514-593f-000000000125] 13355 1727096194.53474: sending task result for task 0afff68d-5257-c514-593f-000000000125 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096194.53720: no more pending results, returning what we have 13355 1727096194.53724: results queue empty 13355 1727096194.53725: checking for any_errors_fatal 13355 1727096194.53732: done checking for any_errors_fatal 13355 1727096194.53733: checking for max_fail_percentage 13355 
1727096194.53735: done checking for max_fail_percentage 13355 1727096194.53735: checking to see if all hosts have failed and the running result is not ok 13355 1727096194.53736: done checking to see if all hosts have failed 13355 1727096194.53737: getting the remaining hosts for this loop 13355 1727096194.53738: done getting the remaining hosts for this loop 13355 1727096194.53742: getting the next task for host managed_node3 13355 1727096194.53750: done getting next task for host managed_node3 13355 1727096194.53754: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13355 1727096194.53757: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096194.53783: getting variables 13355 1727096194.53785: in VariableManager get_vars() 13355 1727096194.53846: Calling all_inventory to load vars for managed_node3 13355 1727096194.53849: Calling groups_inventory to load vars for managed_node3 13355 1727096194.53852: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096194.53865: Calling all_plugins_play to load vars for managed_node3 13355 1727096194.53987: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096194.54073: Calling groups_plugins_play to load vars for managed_node3 13355 1727096194.54604: done sending task result for task 0afff68d-5257-c514-593f-000000000125 13355 1727096194.54608: WORKER PROCESS EXITING 13355 1727096194.55445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096194.57065: done with get_vars() 13355 1727096194.57098: done getting variables 13355 1727096194.57170: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:56:34 -0400 (0:00:00.054) 0:00:43.832 ****** 13355 1727096194.57209: entering _queue_task() for managed_node3/service 13355 1727096194.57693: worker is 1 (out of 1 available) 13355 1727096194.57704: exiting _queue_task() for managed_node3/service 13355 1727096194.57715: done queuing things up, now waiting for results queue to drain 13355 1727096194.57716: waiting for pending results... 
13355 1727096194.57908: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13355 1727096194.58056: in run() - task 0afff68d-5257-c514-593f-000000000126 13355 1727096194.58080: variable 'ansible_search_path' from source: unknown 13355 1727096194.58088: variable 'ansible_search_path' from source: unknown 13355 1727096194.58136: calling self._execute() 13355 1727096194.58252: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096194.58271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096194.58287: variable 'omit' from source: magic vars 13355 1727096194.58702: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.58721: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096194.58856: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096194.59084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096194.61726: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096194.61797: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096194.61846: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096194.62072: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096194.62075: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096194.62079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13355 1727096194.62082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.62085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.62099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.62116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.62161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.62187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.62220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.62265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.62287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.62337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.62360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.62391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.62438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.62453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.62626: variable 'network_connections' from source: task vars 13355 1727096194.62654: variable 'port1_profile' from source: play vars 13355 1727096194.62724: variable 'port1_profile' from source: play vars 13355 1727096194.62738: variable 'port2_profile' from source: play vars 13355 1727096194.62810: variable 'port2_profile' from source: play vars 13355 1727096194.62892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096194.63076: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096194.63107: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 
1727096194.63184: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096194.63188: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096194.63231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096194.63259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096194.63296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.63323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096194.63377: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096194.63630: variable 'network_connections' from source: task vars 13355 1727096194.63642: variable 'port1_profile' from source: play vars 13355 1727096194.63727: variable 'port1_profile' from source: play vars 13355 1727096194.63730: variable 'port2_profile' from source: play vars 13355 1727096194.63784: variable 'port2_profile' from source: play vars 13355 1727096194.63837: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096194.63840: when evaluation is False, skipping this task 13355 1727096194.63843: _execute() done 13355 1727096194.63845: dumping result to json 13355 1727096194.63847: done dumping result, returning 13355 1727096194.63850: done running TaskExecutor() for managed_node3/TASK: 
fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000126] 13355 1727096194.63945: sending task result for task 0afff68d-5257-c514-593f-000000000126 13355 1727096194.64011: done sending task result for task 0afff68d-5257-c514-593f-000000000126 13355 1727096194.64014: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096194.64100: no more pending results, returning what we have 13355 1727096194.64103: results queue empty 13355 1727096194.64104: checking for any_errors_fatal 13355 1727096194.64111: done checking for any_errors_fatal 13355 1727096194.64111: checking for max_fail_percentage 13355 1727096194.64114: done checking for max_fail_percentage 13355 1727096194.64115: checking to see if all hosts have failed and the running result is not ok 13355 1727096194.64116: done checking to see if all hosts have failed 13355 1727096194.64116: getting the remaining hosts for this loop 13355 1727096194.64118: done getting the remaining hosts for this loop 13355 1727096194.64122: getting the next task for host managed_node3 13355 1727096194.64129: done getting next task for host managed_node3 13355 1727096194.64133: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096194.64136: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096194.64325: getting variables 13355 1727096194.64328: in VariableManager get_vars() 13355 1727096194.64430: Calling all_inventory to load vars for managed_node3 13355 1727096194.64433: Calling groups_inventory to load vars for managed_node3 13355 1727096194.64436: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096194.64447: Calling all_plugins_play to load vars for managed_node3 13355 1727096194.64450: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096194.64453: Calling groups_plugins_play to load vars for managed_node3 13355 1727096194.66062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096194.67914: done with get_vars() 13355 1727096194.67950: done getting variables 13355 1727096194.68010: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:56:34 -0400 (0:00:00.108) 0:00:43.941 ****** 13355 1727096194.68048: entering _queue_task() for managed_node3/service 13355 1727096194.68584: worker is 1 (out of 1 available) 13355 1727096194.68597: exiting _queue_task() for managed_node3/service 13355 1727096194.68608: done queuing things up, now waiting for results queue to drain 13355 1727096194.68609: waiting for pending results... 
13355 1727096194.68931: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096194.69133: in run() - task 0afff68d-5257-c514-593f-000000000127 13355 1727096194.69137: variable 'ansible_search_path' from source: unknown 13355 1727096194.69140: variable 'ansible_search_path' from source: unknown 13355 1727096194.69173: calling self._execute() 13355 1727096194.69294: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096194.69350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096194.69353: variable 'omit' from source: magic vars 13355 1727096194.69748: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.69770: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096194.70173: variable 'network_provider' from source: set_fact 13355 1727096194.70177: variable 'network_state' from source: role '' defaults 13355 1727096194.70180: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13355 1727096194.70182: variable 'omit' from source: magic vars 13355 1727096194.70185: variable 'omit' from source: magic vars 13355 1727096194.70187: variable 'network_service_name' from source: role '' defaults 13355 1727096194.70189: variable 'network_service_name' from source: role '' defaults 13355 1727096194.70261: variable '__network_provider_setup' from source: role '' defaults 13355 1727096194.70276: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096194.70347: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096194.70363: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096194.70433: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096194.70740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13355 1727096194.75944: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096194.76033: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096194.76271: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096194.76274: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096194.76276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096194.76347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.76515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.76546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.76681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.76707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.76756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096194.76833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.76943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.76989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.77011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.77675: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096194.77820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.77850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.77880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.78040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.78059: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.78222: variable 'ansible_python' from source: facts 13355 1727096194.78249: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096194.78406: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096194.78593: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096194.78902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.78932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.79004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.79132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.79151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.79308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096194.79348: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096194.79426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.79634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096194.79637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096194.79770: variable 'network_connections' from source: task vars 13355 1727096194.79885: variable 'port1_profile' from source: play vars 13355 1727096194.79962: variable 'port1_profile' from source: play vars 13355 1727096194.80193: variable 'port2_profile' from source: play vars 13355 1727096194.80405: variable 'port2_profile' from source: play vars 13355 1727096194.80550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096194.80945: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096194.81114: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096194.81214: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096194.81257: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096194.81454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096194.81489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096194.81673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096194.81676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096194.81974: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096194.82409: variable 'network_connections' from source: task vars 13355 1727096194.82422: variable 'port1_profile' from source: play vars 13355 1727096194.82624: variable 'port1_profile' from source: play vars 13355 1727096194.82645: variable 'port2_profile' from source: play vars 13355 1727096194.82732: variable 'port2_profile' from source: play vars 13355 1727096194.82914: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096194.83112: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096194.83704: variable 'network_connections' from source: task vars 13355 1727096194.83714: variable 'port1_profile' from source: play vars 13355 1727096194.83973: variable 'port1_profile' from source: play vars 13355 1727096194.83976: variable 'port2_profile' from source: play vars 13355 1727096194.84076: variable 'port2_profile' from source: play vars 13355 1727096194.84109: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096194.84200: variable '__network_team_connections_defined' from source: role '' defaults 
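The `variable '…' from source: …` annotations above trace Ansible's variable-precedence resolution: the same name (e.g. `port1_profile`, `network_provider`) can be defined by role defaults, facts, play vars, `set_fact`, and task vars, and the highest-precedence definition wins. The following is a toy sketch of that ordering, not Ansible's actual resolution code; the source names mirror the log annotations but the list is abridged:

```python
# Toy model of the precedence the log annotations reflect: sources are
# ordered lowest to highest, so a later source overrides an earlier one
# (role defaults lose to set_fact, which loses to task vars, etc.).
# This is an illustration only -- Ansible's real precedence has many
# more levels than shown here.
PRECEDENCE = ["role defaults", "facts", "play vars", "set_fact", "task vars"]

def resolve(variable, sources):
    """Return (value, source_name) from the highest-precedence source
    defining `variable`, or None if no source defines it."""
    winner = None
    for source_name in PRECEDENCE:
        values = sources.get(source_name, {})
        if variable in values:
            # Later (higher-precedence) sources overwrite earlier ones.
            winner = (values[variable], source_name)
    return winner

# Hypothetical example: a role default shadowed by a set_fact, as with
# 'network_provider' in the log above.
sources = {
    "role defaults": {"network_provider": "initscripts"},
    "set_fact":      {"network_provider": "nm"},
}
```

Run against the example `sources`, `resolve("network_provider", sources)` picks the `set_fact` value, which matches why the log reports `variable 'network_provider' from source: set_fact` rather than the role default.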
13355 1727096194.84801: variable 'network_connections' from source: task vars 13355 1727096194.84857: variable 'port1_profile' from source: play vars 13355 1727096194.85073: variable 'port1_profile' from source: play vars 13355 1727096194.85077: variable 'port2_profile' from source: play vars 13355 1727096194.85180: variable 'port2_profile' from source: play vars 13355 1727096194.85240: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096194.85409: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096194.85412: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096194.85738: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096194.86192: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096194.87027: variable 'network_connections' from source: task vars 13355 1727096194.87044: variable 'port1_profile' from source: play vars 13355 1727096194.87110: variable 'port1_profile' from source: play vars 13355 1727096194.87124: variable 'port2_profile' from source: play vars 13355 1727096194.87191: variable 'port2_profile' from source: play vars 13355 1727096194.87205: variable 'ansible_distribution' from source: facts 13355 1727096194.87214: variable '__network_rh_distros' from source: role '' defaults 13355 1727096194.87225: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.87245: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096194.87429: variable 'ansible_distribution' from source: facts 13355 1727096194.87440: variable '__network_rh_distros' from source: role '' defaults 13355 1727096194.87449: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.87469: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' 
defaults 13355 1727096194.87642: variable 'ansible_distribution' from source: facts 13355 1727096194.87651: variable '__network_rh_distros' from source: role '' defaults 13355 1727096194.87660: variable 'ansible_distribution_major_version' from source: facts 13355 1727096194.87707: variable 'network_provider' from source: set_fact 13355 1727096194.87736: variable 'omit' from source: magic vars 13355 1727096194.87771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096194.87808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096194.87829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096194.87851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096194.87866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096194.87908: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096194.87972: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096194.87976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096194.88025: Set connection var ansible_shell_executable to /bin/sh 13355 1727096194.88036: Set connection var ansible_shell_type to sh 13355 1727096194.88045: Set connection var ansible_pipelining to False 13355 1727096194.88054: Set connection var ansible_connection to ssh 13355 1727096194.88063: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096194.88075: Set connection var ansible_timeout to 10 13355 1727096194.88105: variable 'ansible_shell_executable' from source: unknown 13355 1727096194.88113: variable 'ansible_connection' from source: unknown 13355 1727096194.88125: variable 
'ansible_module_compression' from source: unknown 13355 1727096194.88132: variable 'ansible_shell_type' from source: unknown 13355 1727096194.88145: variable 'ansible_shell_executable' from source: unknown 13355 1727096194.88152: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096194.88234: variable 'ansible_pipelining' from source: unknown 13355 1727096194.88237: variable 'ansible_timeout' from source: unknown 13355 1727096194.88240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096194.88289: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096194.88306: variable 'omit' from source: magic vars 13355 1727096194.88315: starting attempt loop 13355 1727096194.88321: running the handler 13355 1727096194.88405: variable 'ansible_facts' from source: unknown 13355 1727096194.89172: _low_level_execute_command(): starting 13355 1727096194.89185: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096194.89877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096194.89893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096194.89988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096194.90010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096194.90025: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096194.90194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096194.90318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096194.92003: stdout chunk (state=3): >>>/root <<< 13355 1727096194.92165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096194.92172: stdout chunk (state=3): >>><<< 13355 1727096194.92174: stderr chunk (state=3): >>><<< 13355 1727096194.92485: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096194.92489: _low_level_execute_command(): starting 13355 1727096194.92492: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557 `" && echo ansible-tmp-1727096194.9239008-15249-273211485176557="` echo /root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557 `" ) && sleep 0' 13355 1727096194.93509: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096194.93582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096194.93796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096194.93824: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13355 1727096194.93895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096194.95871: stdout chunk (state=3): >>>ansible-tmp-1727096194.9239008-15249-273211485176557=/root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557 <<< 13355 1727096194.96082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096194.96094: stderr chunk (state=3): >>><<< 13355 1727096194.96103: stdout chunk (state=3): >>><<< 13355 1727096194.96140: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096194.9239008-15249-273211485176557=/root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096194.96206: variable 'ansible_module_compression' from source: 
unknown 13355 1727096194.96362: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13355 1727096194.96453: variable 'ansible_facts' from source: unknown 13355 1727096194.97171: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/AnsiballZ_systemd.py 13355 1727096194.97418: Sending initial data 13355 1727096194.97582: Sent initial data (156 bytes) 13355 1727096194.98589: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096194.98652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096194.98784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096194.98874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096194.98900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096194.98923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096194.98989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 
1727096195.00723: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096195.00838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096195.00879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp13tqov42 /root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/AnsiballZ_systemd.py <<< 13355 1727096195.00886: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/AnsiballZ_systemd.py" <<< 13355 1727096195.00917: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp13tqov42" to remote "/root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/AnsiballZ_systemd.py" <<< 13355 1727096195.04863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096195.04934: stderr chunk (state=3): >>><<< 13355 1727096195.04937: stdout chunk (state=3): >>><<< 13355 1727096195.04963: done transferring module to remote 13355 
1727096195.04976: _low_level_execute_command(): starting 13355 1727096195.04980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/ /root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/AnsiballZ_systemd.py && sleep 0' 13355 1727096195.06580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096195.06689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096195.06822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096195.08830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096195.08834: stdout chunk (state=3): >>><<< 13355 1727096195.08842: stderr chunk (state=3): >>><<< 13355 1727096195.08858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096195.08863: _low_level_execute_command(): starting 13355 1727096195.08871: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/AnsiballZ_systemd.py && sleep 0' 13355 1727096195.10609: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096195.10634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096195.10652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096195.10874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096195.10907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096195.10923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096195.11027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096195.40862: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", 
"ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10559488", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299442688", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1209292000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 13355 1727096195.40901: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": 
"0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", 
"InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13355 1727096195.42953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096195.42979: stdout chunk (state=3): >>><<< 13355 1727096195.42993: stderr chunk (state=3): >>><<< 13355 1727096195.43175: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10559488", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299442688", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1209292000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
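For readability: the module result above ends the "Enable and start NetworkManager" task. Based on the `module_args` recorded in the invocation (`name: NetworkManager`, `state: started`, `enabled: true`, `scope: system`), the task the role ran is equivalent to something like the following sketch (the exact task name and any `no_log`/variable indirection in the role are assumptions, not shown verbatim in this log):

```yaml
# Hypothetical reconstruction of the task behind the result above;
# argument values are taken directly from the logged module_args.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    scope: system
```

Note that the result reports `"changed": false` with `ActiveState: active` and `UnitFileState: enabled`, i.e. the unit was already running and enabled, so the task was idempotent here.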
13355 1727096195.43249: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096195.43392: _low_level_execute_command(): starting 13355 1727096195.43411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096194.9239008-15249-273211485176557/ > /dev/null 2>&1 && sleep 0' 13355 1727096195.44692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096195.44784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096195.44917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096195.44993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096195.45105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096195.46992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096195.47028: stderr chunk (state=3): >>><<< 13355 1727096195.47033: stdout chunk (state=3): >>><<< 13355 1727096195.47049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096195.47382: handler run complete 13355 
1727096195.47386: attempt loop complete, returning result 13355 1727096195.47389: _execute() done 13355 1727096195.47391: dumping result to json 13355 1727096195.47393: done dumping result, returning 13355 1727096195.47395: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-c514-593f-000000000127] 13355 1727096195.47397: sending task result for task 0afff68d-5257-c514-593f-000000000127 13355 1727096195.47646: done sending task result for task 0afff68d-5257-c514-593f-000000000127 13355 1727096195.47649: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096195.47916: no more pending results, returning what we have 13355 1727096195.47919: results queue empty 13355 1727096195.47920: checking for any_errors_fatal 13355 1727096195.47924: done checking for any_errors_fatal 13355 1727096195.47925: checking for max_fail_percentage 13355 1727096195.47926: done checking for max_fail_percentage 13355 1727096195.47927: checking to see if all hosts have failed and the running result is not ok 13355 1727096195.47927: done checking to see if all hosts have failed 13355 1727096195.47928: getting the remaining hosts for this loop 13355 1727096195.47929: done getting the remaining hosts for this loop 13355 1727096195.47932: getting the next task for host managed_node3 13355 1727096195.47937: done getting next task for host managed_node3 13355 1727096195.47940: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096195.47942: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096195.47951: getting variables 13355 1727096195.47952: in VariableManager get_vars() 13355 1727096195.47996: Calling all_inventory to load vars for managed_node3 13355 1727096195.47999: Calling groups_inventory to load vars for managed_node3 13355 1727096195.48001: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096195.48009: Calling all_plugins_play to load vars for managed_node3 13355 1727096195.48013: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096195.48015: Calling groups_plugins_play to load vars for managed_node3 13355 1727096195.50960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096195.52851: done with get_vars() 13355 1727096195.52885: done getting variables 13355 1727096195.52945: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:56:35 -0400 (0:00:00.849) 0:00:44.790 ****** 13355 1727096195.52991: entering _queue_task() for managed_node3/service 13355 1727096195.53448: worker is 1 (out of 1 available) 13355 1727096195.53464: exiting _queue_task() for managed_node3/service 
13355 1727096195.53480: done queuing things up, now waiting for results queue to drain 13355 1727096195.53481: waiting for pending results... 13355 1727096195.53990: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096195.54172: in run() - task 0afff68d-5257-c514-593f-000000000128 13355 1727096195.54177: variable 'ansible_search_path' from source: unknown 13355 1727096195.54180: variable 'ansible_search_path' from source: unknown 13355 1727096195.54182: calling self._execute() 13355 1727096195.54185: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096195.54188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096195.54190: variable 'omit' from source: magic vars 13355 1727096195.54488: variable 'ansible_distribution_major_version' from source: facts 13355 1727096195.54673: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096195.54676: variable 'network_provider' from source: set_fact 13355 1727096195.54679: Evaluated conditional (network_provider == "nm"): True 13355 1727096195.54734: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096195.54831: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096195.55176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096195.57247: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096195.57320: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096195.57354: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096195.57398: Loading FilterModule 'urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096195.57424: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096195.57523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096195.57551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096195.57580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096195.57628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096195.57642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096195.57692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096195.57722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096195.57746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 13355 1727096195.57789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096195.57802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096195.57848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096195.57877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096195.57896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096195.57941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096195.57954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096195.58108: variable 'network_connections' from source: task vars 13355 1727096195.58121: variable 'port1_profile' from source: play vars 13355 1727096195.58196: variable 'port1_profile' from source: play vars 13355 1727096195.58209: variable 'port2_profile' from source: play vars 13355 1727096195.58276: variable 'port2_profile' from source: play vars 13355 1727096195.58344: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096195.58672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096195.58676: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096195.58678: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096195.58783: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096195.58786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096195.58789: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096195.58791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096195.58794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096195.58796: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096195.59311: variable 'network_connections' from source: task vars 13355 1727096195.59316: variable 'port1_profile' from source: play vars 13355 1727096195.59481: variable 'port1_profile' from source: play vars 13355 1727096195.59488: variable 'port2_profile' from source: play vars 13355 1727096195.59659: variable 'port2_profile' from source: play vars 13355 1727096195.59691: Evaluated conditional 
(__network_wpa_supplicant_required): False 13355 1727096195.59694: when evaluation is False, skipping this task 13355 1727096195.59707: _execute() done 13355 1727096195.59710: dumping result to json 13355 1727096195.59713: done dumping result, returning 13355 1727096195.59715: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-c514-593f-000000000128] 13355 1727096195.59717: sending task result for task 0afff68d-5257-c514-593f-000000000128 13355 1727096195.59864: done sending task result for task 0afff68d-5257-c514-593f-000000000128 13355 1727096195.59869: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13355 1727096195.59920: no more pending results, returning what we have 13355 1727096195.59924: results queue empty 13355 1727096195.59925: checking for any_errors_fatal 13355 1727096195.59948: done checking for any_errors_fatal 13355 1727096195.59949: checking for max_fail_percentage 13355 1727096195.59951: done checking for max_fail_percentage 13355 1727096195.59951: checking to see if all hosts have failed and the running result is not ok 13355 1727096195.59952: done checking to see if all hosts have failed 13355 1727096195.59953: getting the remaining hosts for this loop 13355 1727096195.59954: done getting the remaining hosts for this loop 13355 1727096195.59960: getting the next task for host managed_node3 13355 1727096195.59970: done getting next task for host managed_node3 13355 1727096195.59974: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096195.59977: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
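The skip above follows from the conditionals the log shows being evaluated in order: `ansible_distribution_major_version != '6'` (True), `network_provider == "nm"` (True), and finally `__network_wpa_supplicant_required` (False, because no IEEE 802.1X or wireless connections are defined). A plausible sketch of the guarded task, assuming a service module call similar to the NetworkManager task (the service name and exact `when` list are inferred, not quoted from the role):

```yaml
# Hypothetical sketch of the skipped task; the false_condition in the
# result identifies __network_wpa_supplicant_required as the gate.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
```

Because the first failing condition short-circuits the task, the host reports `skipping` with `"skip_reason": "Conditional result was False"` rather than `ok` or `changed`.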
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096195.59999: getting variables 13355 1727096195.60001: in VariableManager get_vars() 13355 1727096195.60055: Calling all_inventory to load vars for managed_node3 13355 1727096195.60061: Calling groups_inventory to load vars for managed_node3 13355 1727096195.60063: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096195.60476: Calling all_plugins_play to load vars for managed_node3 13355 1727096195.60481: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096195.60485: Calling groups_plugins_play to load vars for managed_node3 13355 1727096195.62272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096195.63925: done with get_vars() 13355 1727096195.63959: done getting variables 13355 1727096195.64025: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:56:35 -0400 (0:00:00.110) 0:00:44.901 ****** 13355 1727096195.64059: entering _queue_task() for managed_node3/service 13355 1727096195.64503: worker is 1 (out of 1 available) 13355 1727096195.64515: exiting _queue_task() for managed_node3/service 
13355 1727096195.64640: done queuing things up, now waiting for results queue to drain
13355 1727096195.64642: waiting for pending results...
13355 1727096195.64989: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service
13355 1727096195.64994: in run() - task 0afff68d-5257-c514-593f-000000000129
13355 1727096195.64997: variable 'ansible_search_path' from source: unknown
13355 1727096195.64999: variable 'ansible_search_path' from source: unknown
13355 1727096195.65174: calling self._execute()
13355 1727096195.65178: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096195.65181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096195.65184: variable 'omit' from source: magic vars
13355 1727096195.65526: variable 'ansible_distribution_major_version' from source: facts
13355 1727096195.65538: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096195.65872: variable 'network_provider' from source: set_fact
13355 1727096195.65875: Evaluated conditional (network_provider == "initscripts"): False
13355 1727096195.65877: when evaluation is False, skipping this task
13355 1727096195.65878: _execute() done
13355 1727096195.65880: dumping result to json
13355 1727096195.65882: done dumping result, returning
13355 1727096195.65884: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-c514-593f-000000000129]
13355 1727096195.65885: sending task result for task 0afff68d-5257-c514-593f-000000000129
13355 1727096195.65947: done sending task result for task 0afff68d-5257-c514-593f-000000000129
13355 1727096195.65950: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13355 1727096195.65991: no more pending results, returning what we have
13355 1727096195.65994: results queue empty
13355 1727096195.65995: checking for any_errors_fatal
13355 1727096195.66000: done checking for any_errors_fatal
13355 1727096195.66001: checking for max_fail_percentage
13355 1727096195.66002: done checking for max_fail_percentage
13355 1727096195.66003: checking to see if all hosts have failed and the running result is not ok
13355 1727096195.66004: done checking to see if all hosts have failed
13355 1727096195.66004: getting the remaining hosts for this loop
13355 1727096195.66006: done getting the remaining hosts for this loop
13355 1727096195.66009: getting the next task for host managed_node3
13355 1727096195.66014: done getting next task for host managed_node3
13355 1727096195.66018: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
13355 1727096195.66020: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096195.66038: getting variables
13355 1727096195.66040: in VariableManager get_vars()
13355 1727096195.66193: Calling all_inventory to load vars for managed_node3
13355 1727096195.66196: Calling groups_inventory to load vars for managed_node3
13355 1727096195.66198: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096195.66207: Calling all_plugins_play to load vars for managed_node3
13355 1727096195.66210: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096195.66212: Calling groups_plugins_play to load vars for managed_node3
13355 1727096195.67728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096195.69748: done with get_vars()
13355 1727096195.69784: done getting variables
13355 1727096195.69842: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Monday 23 September 2024 08:56:35 -0400 (0:00:00.060) 0:00:44.961 ******
13355 1727096195.70083: entering _queue_task() for managed_node3/copy
13355 1727096195.70641: worker is 1 (out of 1 available)
13355 1727096195.70654: exiting _queue_task() for managed_node3/copy
13355 1727096195.70843: done queuing things up, now waiting for results queue to drain
13355 1727096195.70845: waiting for pending results...
13355 1727096195.71237: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
13355 1727096195.71449: in run() - task 0afff68d-5257-c514-593f-00000000012a
13355 1727096195.71596: variable 'ansible_search_path' from source: unknown
13355 1727096195.71619: variable 'ansible_search_path' from source: unknown
13355 1727096195.71704: calling self._execute()
13355 1727096195.71895: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096195.71981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096195.71998: variable 'omit' from source: magic vars
13355 1727096195.72747: variable 'ansible_distribution_major_version' from source: facts
13355 1727096195.72765: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096195.73081: variable 'network_provider' from source: set_fact
13355 1727096195.73086: Evaluated conditional (network_provider == "initscripts"): False
13355 1727096195.73089: when evaluation is False, skipping this task
13355 1727096195.73092: _execute() done
13355 1727096195.73096: dumping result to json
13355 1727096195.73098: done dumping result, returning
13355 1727096195.73295: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-c514-593f-00000000012a]
13355 1727096195.73299: sending task result for task 0afff68d-5257-c514-593f-00000000012a
13355 1727096195.73397: done sending task result for task 0afff68d-5257-c514-593f-00000000012a
13355 1727096195.73401: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13355 1727096195.73454: no more pending results, returning what we have
13355 1727096195.73462: results queue empty
13355 1727096195.73463: checking for any_errors_fatal
13355 1727096195.73472: done checking for any_errors_fatal
13355 1727096195.73473: checking for max_fail_percentage
13355 1727096195.73475: done checking for max_fail_percentage
13355 1727096195.73476: checking to see if all hosts have failed and the running result is not ok
13355 1727096195.73478: done checking to see if all hosts have failed
13355 1727096195.73479: getting the remaining hosts for this loop
13355 1727096195.73481: done getting the remaining hosts for this loop
13355 1727096195.73485: getting the next task for host managed_node3
13355 1727096195.73492: done getting next task for host managed_node3
13355 1727096195.73497: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
13355 1727096195.73501: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096195.73526: getting variables
13355 1727096195.73530: in VariableManager get_vars()
13355 1727096195.73789: Calling all_inventory to load vars for managed_node3
13355 1727096195.73792: Calling groups_inventory to load vars for managed_node3
13355 1727096195.73795: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096195.73806: Calling all_plugins_play to load vars for managed_node3
13355 1727096195.73810: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096195.73814: Calling groups_plugins_play to load vars for managed_node3
13355 1727096195.76734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096195.78875: done with get_vars()
13355 1727096195.78909: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Monday 23 September 2024 08:56:35 -0400 (0:00:00.091) 0:00:45.052 ******
13355 1727096195.79201: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
13355 1727096195.79766: worker is 1 (out of 1 available)
13355 1727096195.79984: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections
13355 1727096195.79999: done queuing things up, now waiting for results queue to drain
13355 1727096195.80000: waiting for pending results...
13355 1727096195.80584: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
13355 1727096195.80682: in run() - task 0afff68d-5257-c514-593f-00000000012b
13355 1727096195.80698: variable 'ansible_search_path' from source: unknown
13355 1727096195.80704: variable 'ansible_search_path' from source: unknown
13355 1727096195.80860: calling self._execute()
13355 1727096195.81035: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096195.81039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096195.81166: variable 'omit' from source: magic vars
13355 1727096195.81972: variable 'ansible_distribution_major_version' from source: facts
13355 1727096195.81984: Evaluated conditional (ansible_distribution_major_version != '6'): True
13355 1727096195.81990: variable 'omit' from source: magic vars
13355 1727096195.82097: variable 'omit' from source: magic vars
13355 1727096195.82509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13355 1727096195.85911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13355 1727096195.85915: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13355 1727096195.85947: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13355 1727096195.85987: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13355 1727096195.86014: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13355 1727096195.86100: variable 'network_provider' from source: set_fact
13355 1727096195.86245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13355 1727096195.86277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13355 1727096195.86302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13355 1727096195.86346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13355 1727096195.86366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13355 1727096195.86434: variable 'omit' from source: magic vars
13355 1727096195.86551: variable 'omit' from source: magic vars
13355 1727096195.86656: variable 'network_connections' from source: task vars
13355 1727096195.86690: variable 'port1_profile' from source: play vars
13355 1727096195.86735: variable 'port1_profile' from source: play vars
13355 1727096195.86781: variable 'port2_profile' from source: play vars
13355 1727096195.86811: variable 'port2_profile' from source: play vars
13355 1727096195.87062: variable 'omit' from source: magic vars
13355 1727096195.87065: variable '__lsr_ansible_managed' from source: task vars
13355 1727096195.87073: variable '__lsr_ansible_managed' from source: task vars
13355 1727096195.87546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
13355 1727096195.87721: Loaded config def from plugin (lookup/template)
13355 1727096195.87724: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
13355 1727096195.87751: File lookup term: get_ansible_managed.j2
13355 1727096195.87756: variable 'ansible_search_path' from source: unknown
13355 1727096195.87975: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
13355 1727096195.87981: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
13355 1727096195.87985: variable 'ansible_search_path' from source: unknown
13355 1727096195.95745: variable 'ansible_managed' from source: unknown
13355 1727096195.95910: variable 'omit' from source: magic vars
13355 1727096195.95936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13355 1727096195.95965: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13355 1727096195.95986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13355 1727096195.96009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13355 1727096195.96018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13355 1727096195.96048: variable 'inventory_hostname' from source: host vars for 'managed_node3'
13355 1727096195.96051: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096195.96054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096195.96241: Set connection var ansible_shell_executable to /bin/sh
13355 1727096195.96244: Set connection var ansible_shell_type to sh
13355 1727096195.96248: Set connection var ansible_pipelining to False
13355 1727096195.96250: Set connection var ansible_connection to ssh
13355 1727096195.96252: Set connection var ansible_module_compression to ZIP_DEFLATED
13355 1727096195.96254: Set connection var ansible_timeout to 10
13355 1727096195.96256: variable 'ansible_shell_executable' from source: unknown
13355 1727096195.96258: variable 'ansible_connection' from source: unknown
13355 1727096195.96259: variable 'ansible_module_compression' from source: unknown
13355 1727096195.96262: variable 'ansible_shell_type' from source: unknown
13355 1727096195.96264: variable 'ansible_shell_executable' from source: unknown
13355 1727096195.96266: variable 'ansible_host' from source: host vars for 'managed_node3'
13355 1727096195.96269: variable 'ansible_pipelining' from source: unknown
13355 1727096195.96271: variable 'ansible_timeout' from source: unknown
13355 1727096195.96273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
13355 1727096195.96689: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
13355 1727096195.96704: variable 'omit' from source: magic vars
13355 1727096195.96706: starting attempt loop
13355 1727096195.96708: running the handler
13355 1727096195.96710: _low_level_execute_command(): starting
13355 1727096195.96712: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13355 1727096195.97379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096195.97384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
13355 1727096195.97387: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096195.97389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096195.97623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13355 1727096195.99199: stdout chunk (state=3): >>>/root <<<
13355 1727096195.99329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096195.99333: stdout chunk (state=3): >>><<<
13355 1727096195.99342: stderr chunk (state=3): >>><<<
13355 1727096195.99379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13355 1727096195.99383: _low_level_execute_command(): starting
13355 1727096195.99388: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793 `" && echo ansible-tmp-1727096195.993697-15309-112050488275793="` echo /root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793 `" ) && sleep 0'
13355 1727096196.00107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13355 1727096196.00116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13355 1727096196.00128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13355 1727096196.00184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096196.00234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
13355 1727096196.00253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096196.00271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096196.00322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13355 1727096196.02337: stdout chunk (state=3): >>>ansible-tmp-1727096195.993697-15309-112050488275793=/root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793 <<<
13355 1727096196.02501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096196.02505: stdout chunk (state=3): >>><<<
13355 1727096196.02507: stderr chunk (state=3): >>><<<
13355 1727096196.02529: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096195.993697-15309-112050488275793=/root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13355 1727096196.02609: variable 'ansible_module_compression' from source: unknown
13355 1727096196.02641: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED
13355 1727096196.02702: variable 'ansible_facts' from source: unknown
13355 1727096196.02854: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/AnsiballZ_network_connections.py
13355 1727096196.03084: Sending initial data
13355 1727096196.03101: Sent initial data (167 bytes)
13355 1727096196.03693: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13355 1727096196.03791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096196.03839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
13355 1727096196.03864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096196.03887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096196.03969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13355 1727096196.05634: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
13355 1727096196.05692: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
13355 1727096196.05717: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp6v4iy2ek /root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/AnsiballZ_network_connections.py <<<
13355 1727096196.05721: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/AnsiballZ_network_connections.py" <<<
13355 1727096196.05748: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp6v4iy2ek" to remote "/root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/AnsiballZ_network_connections.py" <<<
13355 1727096196.06858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096196.06899: stderr chunk (state=3): >>><<<
13355 1727096196.06908: stdout chunk (state=3): >>><<<
13355 1727096196.06983: done transferring module to remote
13355 1727096196.06988: _low_level_execute_command(): starting
13355 1727096196.06990: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/ /root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/AnsiballZ_network_connections.py && sleep 0'
13355 1727096196.07874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13355 1727096196.07878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096196.07880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
13355 1727096196.07884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<<
13355 1727096196.07946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13355 1727096196.07950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096196.07982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13355 1727096196.10081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13355 1727096196.10086: stdout chunk (state=3): >>><<<
13355 1727096196.10089: stderr chunk (state=3): >>><<<
13355 1727096196.10201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13355 1727096196.10205: _low_level_execute_command(): starting
13355 1727096196.10208: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/AnsiballZ_network_connections.py && sleep 0'
13355 1727096196.11376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13355 1727096196.11684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<<
13355 1727096196.11711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13355 1727096196.11727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13355 1727096196.11810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13355 1727096196.53540: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mrpcsub7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<<
13355 1727096196.53564: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mrpcsub7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0dd2062c-1ff8-43a7-a41c-6a1fd34b6980: error=unknown <<<
13355 1727096196.55478: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mrpcsub7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mrpcsub7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError:
Connection volatilize aborted on bond0.1/c28ac129-a5cf-428d-a75a-c74a7d1cb1ab: error=unknown <<< 13355 1727096196.55682: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13355 1727096196.57730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096196.57734: stdout chunk (state=3): >>><<< 13355 1727096196.57736: stderr chunk (state=3): >>><<< 13355 1727096196.57874: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mrpcsub7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mrpcsub7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0dd2062c-1ff8-43a7-a41c-6a1fd34b6980: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mrpcsub7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_mrpcsub7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/c28ac129-a5cf-428d-a75a-c74a7d1cb1ab: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": 
"absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0.1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096196.57879: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096196.57882: _low_level_execute_command(): starting 13355 1727096196.57884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096195.993697-15309-112050488275793/ > /dev/null 2>&1 && sleep 0' 13355 1727096196.58517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096196.58531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096196.58559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096196.58583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096196.58677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096196.58705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096196.58729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096196.58755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096196.58842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096196.60738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096196.60742: stdout chunk (state=3): >>><<< 13355 1727096196.60745: stderr chunk (state=3): >>><<< 13355 1727096196.60804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 
originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096196.60807: handler run complete 13355 1727096196.60810: attempt loop complete, returning result 13355 1727096196.60812: _execute() done 13355 1727096196.60815: dumping result to json 13355 1727096196.60822: done dumping result, returning 13355 1727096196.60831: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-c514-593f-00000000012b] 13355 1727096196.60912: sending task result for task 0afff68d-5257-c514-593f-00000000012b 13355 1727096196.61021: done sending task result for task 0afff68d-5257-c514-593f-00000000012b changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13355 1727096196.61336: no more pending results, returning what we have 13355 1727096196.61340: results queue empty 13355 1727096196.61341: checking for any_errors_fatal 13355 1727096196.61345: done checking for any_errors_fatal 13355 1727096196.61346: checking for max_fail_percentage 13355 1727096196.61348: done checking for max_fail_percentage 13355 1727096196.61348: checking to see if all hosts have failed and the running result is not ok 13355 1727096196.61349: done checking to see if all hosts have failed 13355 1727096196.61350: getting the remaining hosts for this 
loop 13355 1727096196.61351: done getting the remaining hosts for this loop 13355 1727096196.61354: getting the next task for host managed_node3 13355 1727096196.61360: done getting next task for host managed_node3 13355 1727096196.61363: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096196.61366: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096196.61379: WORKER PROCESS EXITING 13355 1727096196.61386: getting variables 13355 1727096196.61388: in VariableManager get_vars() 13355 1727096196.61441: Calling all_inventory to load vars for managed_node3 13355 1727096196.61444: Calling groups_inventory to load vars for managed_node3 13355 1727096196.61447: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096196.61457: Calling all_plugins_play to load vars for managed_node3 13355 1727096196.61460: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096196.61464: Calling groups_plugins_play to load vars for managed_node3 13355 1727096196.63057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096196.64535: done with get_vars() 13355 1727096196.64563: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 
08:56:36 -0400 (0:00:00.854) 0:00:45.907 ****** 13355 1727096196.64654: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13355 1727096196.65013: worker is 1 (out of 1 available) 13355 1727096196.65025: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13355 1727096196.65038: done queuing things up, now waiting for results queue to drain 13355 1727096196.65040: waiting for pending results... 13355 1727096196.65337: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096196.65479: in run() - task 0afff68d-5257-c514-593f-00000000012c 13355 1727096196.65484: variable 'ansible_search_path' from source: unknown 13355 1727096196.65496: variable 'ansible_search_path' from source: unknown 13355 1727096196.65527: calling self._execute() 13355 1727096196.65644: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.65649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096196.65651: variable 'omit' from source: magic vars 13355 1727096196.66097: variable 'ansible_distribution_major_version' from source: facts 13355 1727096196.66101: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096196.66142: variable 'network_state' from source: role '' defaults 13355 1727096196.66158: Evaluated conditional (network_state != {}): False 13355 1727096196.66162: when evaluation is False, skipping this task 13355 1727096196.66165: _execute() done 13355 1727096196.66168: dumping result to json 13355 1727096196.66171: done dumping result, returning 13355 1727096196.66318: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-c514-593f-00000000012c] 13355 1727096196.66322: sending task result for task 0afff68d-5257-c514-593f-00000000012c 13355 1727096196.66392: done sending task result for task 
0afff68d-5257-c514-593f-00000000012c 13355 1727096196.66395: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096196.66446: no more pending results, returning what we have 13355 1727096196.66450: results queue empty 13355 1727096196.66451: checking for any_errors_fatal 13355 1727096196.66458: done checking for any_errors_fatal 13355 1727096196.66459: checking for max_fail_percentage 13355 1727096196.66462: done checking for max_fail_percentage 13355 1727096196.66463: checking to see if all hosts have failed and the running result is not ok 13355 1727096196.66463: done checking to see if all hosts have failed 13355 1727096196.66464: getting the remaining hosts for this loop 13355 1727096196.66465: done getting the remaining hosts for this loop 13355 1727096196.66470: getting the next task for host managed_node3 13355 1727096196.66477: done getting next task for host managed_node3 13355 1727096196.66481: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096196.66483: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096196.66504: getting variables 13355 1727096196.66505: in VariableManager get_vars() 13355 1727096196.66555: Calling all_inventory to load vars for managed_node3 13355 1727096196.66558: Calling groups_inventory to load vars for managed_node3 13355 1727096196.66561: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096196.66699: Calling all_plugins_play to load vars for managed_node3 13355 1727096196.66703: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096196.66706: Calling groups_plugins_play to load vars for managed_node3 13355 1727096196.68044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096196.70351: done with get_vars() 13355 1727096196.70379: done getting variables 13355 1727096196.70443: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:56:36 -0400 (0:00:00.058) 0:00:45.965 ****** 13355 1727096196.70480: entering _queue_task() for managed_node3/debug 13355 1727096196.70834: worker is 1 (out of 1 available) 13355 1727096196.70846: exiting _queue_task() for managed_node3/debug 13355 1727096196.70859: done queuing things up, now waiting for results queue to drain 13355 1727096196.70860: waiting for pending results... 
13355 1727096196.71234: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096196.71332: in run() - task 0afff68d-5257-c514-593f-00000000012d 13355 1727096196.71336: variable 'ansible_search_path' from source: unknown 13355 1727096196.71339: variable 'ansible_search_path' from source: unknown 13355 1727096196.71342: calling self._execute() 13355 1727096196.71511: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.71515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096196.71518: variable 'omit' from source: magic vars 13355 1727096196.71809: variable 'ansible_distribution_major_version' from source: facts 13355 1727096196.71825: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096196.71828: variable 'omit' from source: magic vars 13355 1727096196.71888: variable 'omit' from source: magic vars 13355 1727096196.71936: variable 'omit' from source: magic vars 13355 1727096196.71980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096196.72020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096196.72040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096196.72060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096196.72113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096196.72122: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096196.72125: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.72128: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13355 1727096196.72317: Set connection var ansible_shell_executable to /bin/sh 13355 1727096196.72325: Set connection var ansible_shell_type to sh 13355 1727096196.72327: Set connection var ansible_pipelining to False 13355 1727096196.72330: Set connection var ansible_connection to ssh 13355 1727096196.72332: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096196.72334: Set connection var ansible_timeout to 10 13355 1727096196.72335: variable 'ansible_shell_executable' from source: unknown 13355 1727096196.72337: variable 'ansible_connection' from source: unknown 13355 1727096196.72340: variable 'ansible_module_compression' from source: unknown 13355 1727096196.72341: variable 'ansible_shell_type' from source: unknown 13355 1727096196.72344: variable 'ansible_shell_executable' from source: unknown 13355 1727096196.72346: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.72347: variable 'ansible_pipelining' from source: unknown 13355 1727096196.72349: variable 'ansible_timeout' from source: unknown 13355 1727096196.72351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096196.72463: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096196.72517: variable 'omit' from source: magic vars 13355 1727096196.72520: starting attempt loop 13355 1727096196.72523: running the handler 13355 1727096196.72621: variable '__network_connections_result' from source: set_fact 13355 1727096196.72676: handler run complete 13355 1727096196.72873: attempt loop complete, returning result 13355 1727096196.72876: _execute() done 13355 1727096196.72878: dumping result to json 13355 1727096196.72879: 
done dumping result, returning 13355 1727096196.72881: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-c514-593f-00000000012d] 13355 1727096196.72883: sending task result for task 0afff68d-5257-c514-593f-00000000012d 13355 1727096196.72945: done sending task result for task 0afff68d-5257-c514-593f-00000000012d 13355 1727096196.72948: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 13355 1727096196.73020: no more pending results, returning what we have 13355 1727096196.73024: results queue empty 13355 1727096196.73025: checking for any_errors_fatal 13355 1727096196.73033: done checking for any_errors_fatal 13355 1727096196.73034: checking for max_fail_percentage 13355 1727096196.73036: done checking for max_fail_percentage 13355 1727096196.73037: checking to see if all hosts have failed and the running result is not ok 13355 1727096196.73038: done checking to see if all hosts have failed 13355 1727096196.73038: getting the remaining hosts for this loop 13355 1727096196.73040: done getting the remaining hosts for this loop 13355 1727096196.73043: getting the next task for host managed_node3 13355 1727096196.73050: done getting next task for host managed_node3 13355 1727096196.73055: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096196.73058: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096196.73071: getting variables 13355 1727096196.73073: in VariableManager get_vars() 13355 1727096196.73125: Calling all_inventory to load vars for managed_node3 13355 1727096196.73128: Calling groups_inventory to load vars for managed_node3 13355 1727096196.73130: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096196.73140: Calling all_plugins_play to load vars for managed_node3 13355 1727096196.73143: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096196.73146: Calling groups_plugins_play to load vars for managed_node3 13355 1727096196.74556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096196.76080: done with get_vars() 13355 1727096196.76110: done getting variables 13355 1727096196.76171: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:56:36 -0400 (0:00:00.057) 0:00:46.022 ****** 13355 1727096196.76207: entering _queue_task() for managed_node3/debug 13355 1727096196.76572: worker is 1 (out of 1 available) 13355 1727096196.76584: exiting _queue_task() for managed_node3/debug 13355 1727096196.76597: done queuing things up, now waiting for results queue to drain 13355 1727096196.76598: waiting for pending results... 
13355 1727096196.76988: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096196.77015: in run() - task 0afff68d-5257-c514-593f-00000000012e 13355 1727096196.77085: variable 'ansible_search_path' from source: unknown 13355 1727096196.77089: variable 'ansible_search_path' from source: unknown 13355 1727096196.77092: calling self._execute() 13355 1727096196.77473: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.77479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096196.77482: variable 'omit' from source: magic vars 13355 1727096196.77603: variable 'ansible_distribution_major_version' from source: facts 13355 1727096196.77607: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096196.77610: variable 'omit' from source: magic vars 13355 1727096196.77639: variable 'omit' from source: magic vars 13355 1727096196.77683: variable 'omit' from source: magic vars 13355 1727096196.77722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096196.77759: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096196.77784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096196.77800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096196.77811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096196.78134: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096196.78137: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.78143: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13355 1727096196.78145: Set connection var ansible_shell_executable to /bin/sh 13355 1727096196.78148: Set connection var ansible_shell_type to sh 13355 1727096196.78150: Set connection var ansible_pipelining to False 13355 1727096196.78152: Set connection var ansible_connection to ssh 13355 1727096196.78154: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096196.78158: Set connection var ansible_timeout to 10 13355 1727096196.78161: variable 'ansible_shell_executable' from source: unknown 13355 1727096196.78163: variable 'ansible_connection' from source: unknown 13355 1727096196.78165: variable 'ansible_module_compression' from source: unknown 13355 1727096196.78168: variable 'ansible_shell_type' from source: unknown 13355 1727096196.78171: variable 'ansible_shell_executable' from source: unknown 13355 1727096196.78173: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.78175: variable 'ansible_pipelining' from source: unknown 13355 1727096196.78177: variable 'ansible_timeout' from source: unknown 13355 1727096196.78179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096196.78182: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096196.78184: variable 'omit' from source: magic vars 13355 1727096196.78186: starting attempt loop 13355 1727096196.78189: running the handler 13355 1727096196.78525: variable '__network_connections_result' from source: set_fact 13355 1727096196.78529: variable '__network_connections_result' from source: set_fact 13355 1727096196.78531: handler run complete 13355 1727096196.78533: attempt loop complete, returning result 13355 1727096196.78534: 
_execute() done 13355 1727096196.78536: dumping result to json 13355 1727096196.78537: done dumping result, returning 13355 1727096196.78541: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-c514-593f-00000000012e] 13355 1727096196.78543: sending task result for task 0afff68d-5257-c514-593f-00000000012e 13355 1727096196.78611: done sending task result for task 0afff68d-5257-c514-593f-00000000012e 13355 1727096196.78614: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0.1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13355 1727096196.78723: no more pending results, returning what we have 13355 1727096196.78727: results queue empty 13355 1727096196.78728: checking for any_errors_fatal 13355 1727096196.78734: done checking for any_errors_fatal 13355 1727096196.78735: checking for max_fail_percentage 13355 1727096196.78737: done checking for max_fail_percentage 13355 1727096196.78737: checking to see if all hosts have failed and the running result is not ok 13355 1727096196.78738: done checking to see if all hosts have failed 13355 1727096196.78739: getting the remaining hosts for this loop 13355 1727096196.78741: done getting the remaining hosts for this loop 13355 1727096196.78745: getting the next task for host managed_node3 13355 1727096196.78753: done getting next task for host managed_node3 13355 1727096196.78757: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096196.78759: ^ state is: HOST STATE: 
block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096196.78775: getting variables 13355 1727096196.78777: in VariableManager get_vars() 13355 1727096196.78833: Calling all_inventory to load vars for managed_node3 13355 1727096196.78836: Calling groups_inventory to load vars for managed_node3 13355 1727096196.78839: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096196.78850: Calling all_plugins_play to load vars for managed_node3 13355 1727096196.78854: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096196.78858: Calling groups_plugins_play to load vars for managed_node3 13355 1727096196.80528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096196.82050: done with get_vars() 13355 1727096196.82086: done getting variables 13355 1727096196.82147: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:56:36 -0400 (0:00:00.059) 0:00:46.082 ****** 13355 1727096196.82189: entering 
_queue_task() for managed_node3/debug 13355 1727096196.82546: worker is 1 (out of 1 available) 13355 1727096196.82560: exiting _queue_task() for managed_node3/debug 13355 1727096196.82577: done queuing things up, now waiting for results queue to drain 13355 1727096196.82578: waiting for pending results... 13355 1727096196.83235: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096196.83405: in run() - task 0afff68d-5257-c514-593f-00000000012f 13355 1727096196.83535: variable 'ansible_search_path' from source: unknown 13355 1727096196.83539: variable 'ansible_search_path' from source: unknown 13355 1727096196.83571: calling self._execute() 13355 1727096196.84474: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.84477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096196.84479: variable 'omit' from source: magic vars 13355 1727096196.85375: variable 'ansible_distribution_major_version' from source: facts 13355 1727096196.85378: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096196.85381: variable 'network_state' from source: role '' defaults 13355 1727096196.85384: Evaluated conditional (network_state != {}): False 13355 1727096196.85388: when evaluation is False, skipping this task 13355 1727096196.85390: _execute() done 13355 1727096196.85393: dumping result to json 13355 1727096196.85396: done dumping result, returning 13355 1727096196.85777: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-c514-593f-00000000012f] 13355 1727096196.85781: sending task result for task 0afff68d-5257-c514-593f-00000000012f 13355 1727096196.85851: done sending task result for task 0afff68d-5257-c514-593f-00000000012f 13355 1727096196.85855: WORKER PROCESS EXITING skipping: [managed_node3] => { 
"false_condition": "network_state != {}" } 13355 1727096196.86010: no more pending results, returning what we have 13355 1727096196.86014: results queue empty 13355 1727096196.86015: checking for any_errors_fatal 13355 1727096196.86027: done checking for any_errors_fatal 13355 1727096196.86027: checking for max_fail_percentage 13355 1727096196.86030: done checking for max_fail_percentage 13355 1727096196.86030: checking to see if all hosts have failed and the running result is not ok 13355 1727096196.86031: done checking to see if all hosts have failed 13355 1727096196.86031: getting the remaining hosts for this loop 13355 1727096196.86033: done getting the remaining hosts for this loop 13355 1727096196.86036: getting the next task for host managed_node3 13355 1727096196.86043: done getting next task for host managed_node3 13355 1727096196.86047: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096196.86050: ^ state is: HOST STATE: block=2, task=29, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096196.86071: getting variables 13355 1727096196.86073: in VariableManager get_vars() 13355 1727096196.86119: Calling all_inventory to load vars for managed_node3 13355 1727096196.86121: Calling groups_inventory to load vars for managed_node3 13355 1727096196.86123: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096196.86133: Calling all_plugins_play to load vars for managed_node3 13355 1727096196.86135: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096196.86138: Calling groups_plugins_play to load vars for managed_node3 13355 1727096196.88873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096196.92014: done with get_vars() 13355 1727096196.92048: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:56:36 -0400 (0:00:00.099) 0:00:46.182 ****** 13355 1727096196.92154: entering _queue_task() for managed_node3/ping 13355 1727096196.92927: worker is 1 (out of 1 available) 13355 1727096196.92938: exiting _queue_task() for managed_node3/ping 13355 1727096196.92951: done queuing things up, now waiting for results queue to drain 13355 1727096196.92953: waiting for pending results... 
13355 1727096196.93389: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096196.93787: in run() - task 0afff68d-5257-c514-593f-000000000130 13355 1727096196.93811: variable 'ansible_search_path' from source: unknown 13355 1727096196.93820: variable 'ansible_search_path' from source: unknown 13355 1727096196.93872: calling self._execute() 13355 1727096196.94185: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.94198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096196.94214: variable 'omit' from source: magic vars 13355 1727096196.95038: variable 'ansible_distribution_major_version' from source: facts 13355 1727096196.95058: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096196.95071: variable 'omit' from source: magic vars 13355 1727096196.95273: variable 'omit' from source: magic vars 13355 1727096196.95310: variable 'omit' from source: magic vars 13355 1727096196.95525: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096196.95529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096196.95532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096196.95675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096196.95678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096196.95701: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096196.95710: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.95717: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13355 1727096196.95939: Set connection var ansible_shell_executable to /bin/sh 13355 1727096196.95953: Set connection var ansible_shell_type to sh 13355 1727096196.95970: Set connection var ansible_pipelining to False 13355 1727096196.96005: Set connection var ansible_connection to ssh 13355 1727096196.96014: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096196.96023: Set connection var ansible_timeout to 10 13355 1727096196.96099: variable 'ansible_shell_executable' from source: unknown 13355 1727096196.96113: variable 'ansible_connection' from source: unknown 13355 1727096196.96120: variable 'ansible_module_compression' from source: unknown 13355 1727096196.96126: variable 'ansible_shell_type' from source: unknown 13355 1727096196.96132: variable 'ansible_shell_executable' from source: unknown 13355 1727096196.96138: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096196.96275: variable 'ansible_pipelining' from source: unknown 13355 1727096196.96278: variable 'ansible_timeout' from source: unknown 13355 1727096196.96281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096196.96875: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096196.96880: variable 'omit' from source: magic vars 13355 1727096196.96882: starting attempt loop 13355 1727096196.96884: running the handler 13355 1727096196.96887: _low_level_execute_command(): starting 13355 1727096196.96889: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096196.98192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096196.98590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.00090: stdout chunk (state=3): >>>/root <<< 13355 1727096197.00179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.00217: stderr chunk (state=3): >>><<< 13355 1727096197.00226: stdout chunk (state=3): >>><<< 13355 1727096197.00495: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096197.00499: _low_level_execute_command(): starting 13355 1727096197.00502: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588 `" && echo ansible-tmp-1727096197.003918-15360-175373055430588="` echo /root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588 `" ) && sleep 0' 13355 1727096197.01585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096197.01601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.01670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096197.01785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.01904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.04375: stdout chunk (state=3): >>>ansible-tmp-1727096197.003918-15360-175373055430588=/root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588 <<< 13355 1727096197.04380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.04383: stdout chunk (state=3): >>><<< 13355 1727096197.04386: stderr chunk (state=3): >>><<< 13355 1727096197.04389: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096197.003918-15360-175373055430588=/root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096197.04392: variable 'ansible_module_compression' from source: unknown 13355 1727096197.04394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13355 1727096197.04397: variable 'ansible_facts' from source: unknown 13355 1727096197.04876: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/AnsiballZ_ping.py 13355 1727096197.05313: Sending initial data 13355 1727096197.05325: Sent initial data (152 bytes) 13355 1727096197.06394: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.06431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096197.06451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.06603: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.06891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.08418: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096197.08446: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096197.08500: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpf3aqynzw /root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/AnsiballZ_ping.py <<< 13355 1727096197.08514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/AnsiballZ_ping.py" <<< 13355 1727096197.08539: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpf3aqynzw" to remote "/root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/AnsiballZ_ping.py" <<< 13355 1727096197.09975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.09979: stdout chunk (state=3): >>><<< 13355 1727096197.09982: stderr chunk (state=3): >>><<< 13355 1727096197.09984: done transferring module to remote 13355 1727096197.09986: _low_level_execute_command(): starting 13355 1727096197.09989: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/ /root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/AnsiballZ_ping.py && sleep 0' 13355 1727096197.11393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.11601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096197.11616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.11665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.13587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.13628: stdout chunk (state=3): >>><<< 13355 1727096197.13631: stderr chunk (state=3): >>><<< 13355 1727096197.13673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096197.13682: _low_level_execute_command(): starting 13355 1727096197.13685: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/AnsiballZ_ping.py && sleep 0' 13355 1727096197.14349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096197.14375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096197.14549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.14694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.14716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.14792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.30097: stdout chunk (state=3): >>> 
{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13355 1727096197.31548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096197.31579: stdout chunk (state=3): >>><<< 13355 1727096197.31719: stderr chunk (state=3): >>><<< 13355 1727096197.31723: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096197.31726: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096197.31728: _low_level_execute_command(): starting 13355 1727096197.31731: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096197.003918-15360-175373055430588/ > /dev/null 2>&1 && sleep 0' 13355 1727096197.32378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' <<< 13355 1727096197.32398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.32410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.32677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.34476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.34481: stdout chunk (state=3): >>><<< 13355 1727096197.34483: stderr chunk (state=3): >>><<< 13355 1727096197.34486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096197.34488: handler run complete 13355 1727096197.34490: attempt loop complete, returning result 13355 1727096197.34492: _execute() done 13355 1727096197.34494: dumping result to json 
13355 1727096197.34495: done dumping result, returning 13355 1727096197.34497: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-c514-593f-000000000130] 13355 1727096197.34499: sending task result for task 0afff68d-5257-c514-593f-000000000130 ok: [managed_node3] => { "changed": false, "ping": "pong" } 13355 1727096197.34750: no more pending results, returning what we have 13355 1727096197.34754: results queue empty 13355 1727096197.34755: checking for any_errors_fatal 13355 1727096197.34765: done checking for any_errors_fatal 13355 1727096197.34766: checking for max_fail_percentage 13355 1727096197.34775: done checking for max_fail_percentage 13355 1727096197.34776: checking to see if all hosts have failed and the running result is not ok 13355 1727096197.34777: done checking to see if all hosts have failed 13355 1727096197.34778: getting the remaining hosts for this loop 13355 1727096197.34779: done getting the remaining hosts for this loop 13355 1727096197.34783: getting the next task for host managed_node3 13355 1727096197.34794: done getting next task for host managed_node3 13355 1727096197.34797: ^ task is: TASK: meta (role_complete) 13355 1727096197.34800: ^ state is: HOST STATE: block=2, task=30, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096197.34813: getting variables 13355 1727096197.34815: in VariableManager get_vars() 13355 1727096197.35072: Calling all_inventory to load vars for managed_node3 13355 1727096197.35075: Calling groups_inventory to load vars for managed_node3 13355 1727096197.35077: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096197.35084: done sending task result for task 0afff68d-5257-c514-593f-000000000130 13355 1727096197.35086: WORKER PROCESS EXITING 13355 1727096197.35095: Calling all_plugins_play to load vars for managed_node3 13355 1727096197.35098: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096197.35100: Calling groups_plugins_play to load vars for managed_node3 13355 1727096197.42278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096197.43738: done with get_vars() 13355 1727096197.43817: done getting variables 13355 1727096197.43891: done queuing things up, now waiting for results queue to drain 13355 1727096197.43894: results queue empty 13355 1727096197.43895: checking for any_errors_fatal 13355 1727096197.43898: done checking for any_errors_fatal 13355 1727096197.43899: checking for max_fail_percentage 13355 1727096197.43900: done checking for max_fail_percentage 13355 1727096197.43901: checking to see if all hosts have failed and the running result is not ok 13355 1727096197.43902: done checking to see if all hosts have failed 13355 1727096197.43903: getting the remaining hosts for this loop 13355 1727096197.43905: done getting the remaining hosts for this loop 13355 1727096197.43908: getting the next task for host managed_node3 13355 1727096197.43912: done getting next task for host managed_node3 13355 1727096197.43914: ^ task is: TASK: From the active connection, get the controller profile "{{ controller_profile }}" 13355 1727096197.43916: ^ state is: HOST STATE: block=2, task=31, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096197.43918: getting variables 13355 1727096197.43919: in VariableManager get_vars() 13355 1727096197.43943: Calling all_inventory to load vars for managed_node3 13355 1727096197.43946: Calling groups_inventory to load vars for managed_node3 13355 1727096197.43948: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096197.43953: Calling all_plugins_play to load vars for managed_node3 13355 1727096197.43955: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096197.43958: Calling groups_plugins_play to load vars for managed_node3 13355 1727096197.45080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096197.46976: done with get_vars() 13355 1727096197.47004: done getting variables 13355 1727096197.47085: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096197.47245: variable 'controller_profile' from source: play vars TASK [From the active connection, get the controller profile "bond0"] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:200 Monday 23 September 2024 08:56:37 -0400 (0:00:00.551) 0:00:46.733 ****** 13355 1727096197.47272: entering _queue_task() for managed_node3/command 13355 1727096197.47623: worker is 1 (out of 1 available) 13355 1727096197.47636: exiting _queue_task() for managed_node3/command 13355 1727096197.47650: done queuing things up, now waiting for results 
queue to drain 13355 1727096197.47651: waiting for pending results... 13355 1727096197.47937: running TaskExecutor() for managed_node3/TASK: From the active connection, get the controller profile "bond0" 13355 1727096197.48047: in run() - task 0afff68d-5257-c514-593f-000000000160 13355 1727096197.48071: variable 'ansible_search_path' from source: unknown 13355 1727096197.48117: calling self._execute() 13355 1727096197.48221: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096197.48233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096197.48273: variable 'omit' from source: magic vars 13355 1727096197.48637: variable 'ansible_distribution_major_version' from source: facts 13355 1727096197.48655: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096197.48778: variable 'network_provider' from source: set_fact 13355 1727096197.48852: Evaluated conditional (network_provider == "nm"): True 13355 1727096197.48855: variable 'omit' from source: magic vars 13355 1727096197.48858: variable 'omit' from source: magic vars 13355 1727096197.48919: variable 'controller_profile' from source: play vars 13355 1727096197.48942: variable 'omit' from source: magic vars 13355 1727096197.48994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096197.49096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096197.49573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096197.49577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096197.49580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096197.49582: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096197.49584: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096197.49586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096197.49588: Set connection var ansible_shell_executable to /bin/sh 13355 1727096197.49590: Set connection var ansible_shell_type to sh 13355 1727096197.49592: Set connection var ansible_pipelining to False 13355 1727096197.49594: Set connection var ansible_connection to ssh 13355 1727096197.49596: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096197.49598: Set connection var ansible_timeout to 10 13355 1727096197.49600: variable 'ansible_shell_executable' from source: unknown 13355 1727096197.49603: variable 'ansible_connection' from source: unknown 13355 1727096197.49604: variable 'ansible_module_compression' from source: unknown 13355 1727096197.49606: variable 'ansible_shell_type' from source: unknown 13355 1727096197.49608: variable 'ansible_shell_executable' from source: unknown 13355 1727096197.49610: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096197.49612: variable 'ansible_pipelining' from source: unknown 13355 1727096197.49614: variable 'ansible_timeout' from source: unknown 13355 1727096197.49616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096197.50085: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096197.50104: variable 'omit' from source: magic vars 13355 1727096197.50114: starting attempt loop 13355 1727096197.50120: running the handler 13355 1727096197.50141: _low_level_execute_command(): starting 13355 1727096197.50184: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096197.51180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.51245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096197.51272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.51347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.51441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.53118: stdout chunk (state=3): >>>/root <<< 13355 1727096197.53259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.53271: stdout chunk (state=3): >>><<< 13355 1727096197.53385: stderr chunk (state=3): >>><<< 13355 1727096197.53411: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096197.53425: _low_level_execute_command(): starting 13355 1727096197.53431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732 `" && echo ansible-tmp-1727096197.5341198-15393-43902139786732="` echo /root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732 `" ) && sleep 0' 13355 1727096197.54736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.54779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096197.54785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.54845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.54972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.56954: stdout chunk (state=3): >>>ansible-tmp-1727096197.5341198-15393-43902139786732=/root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732 <<< 13355 1727096197.57334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.57338: stdout chunk (state=3): >>><<< 13355 1727096197.57340: stderr chunk (state=3): >>><<< 13355 1727096197.57343: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096197.5341198-15393-43902139786732=/root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096197.57346: variable 'ansible_module_compression' from source: unknown 13355 1727096197.57348: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096197.57478: variable 'ansible_facts' from source: unknown 13355 1727096197.57636: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/AnsiballZ_command.py 13355 1727096197.57896: Sending initial data 13355 1727096197.57899: Sent initial data (155 bytes) 13355 1727096197.58766: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096197.58813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.58926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.58978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096197.59046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.59164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.59196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.60850: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096197.60903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096197.60947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/AnsiballZ_command.py" <<< 13355 1727096197.60978: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp15424gi8 /root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/AnsiballZ_command.py <<< 13355 1727096197.60997: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp15424gi8" to remote "/root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/AnsiballZ_command.py" <<< 13355 1727096197.62477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.62481: stdout chunk (state=3): >>><<< 13355 1727096197.62484: stderr chunk (state=3): >>><<< 13355 1727096197.62486: done transferring module to remote 13355 1727096197.62488: _low_level_execute_command(): starting 13355 1727096197.62490: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/ /root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/AnsiballZ_command.py && sleep 0' 13355 1727096197.63185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096197.63190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.63210: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096197.63213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.63294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.63298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.63359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.65507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.65511: stdout chunk (state=3): >>><<< 13355 1727096197.65514: stderr chunk (state=3): >>><<< 13355 1727096197.65517: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096197.65519: _low_level_execute_command(): starting 13355 1727096197.65522: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/AnsiballZ_command.py && sleep 0' 13355 1727096197.66604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096197.66778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096197.66782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096197.66784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096197.66787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096197.66789: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096197.66791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.66794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096197.66796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096197.66798: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096197.66884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096197.66997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.67021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.67109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.85132: stdout chunk (state=3): >>> {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: a55cc535-e38f-4547-bb0f-3479e284a0c7\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1727096188\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 (default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: 
no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\<<< 13355 1727096197.85171: stdout chunk (state=3): >>>nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: a55cc535-e38f-4547-bb0f-3479e284a0c7\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/28\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/23\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 
192.0.2.223/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:fa:71:61:85:c3:c6\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1727096428\nDHCP4.OPTION[7]: host_name = ip-10-31-14-152\nDHCP4.OPTION[8]: ip_address = 192.0.2.223\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::f5/128\nIP6.ADDRESS[2]: 2001:db8::f871:61ff:fe85:c3c6/64\nIP6.ADDRESS[3]: fe80::f871:61ff:fe85:c3c6/64\nIP6.GATEWAY: fe80::8846:a3ff:fea6:9457\nIP6.ROUTE[1]: dst = 2001:db8::f5/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::8846:a3ff:fea6:9457, mt = 300\nIP6.DNS[1]: 2001:db8::98be:d7ff:fe0d:a8a3\nIP6.DNS[2]: fe80::8846:a3ff:fea6:9457\nDHCP6.OPTION[1]: 
dhcp6_client_id = 00:04:f8:bb:81:64:92:68:84:8e:b8:30:57:ee:ce:2d:ea:a7\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::98be:d7ff:fe0d:a8a3\nDHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-14-152\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::f5", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-23 08:56:37.828723", "end": "2024-09-23 08:56:37.847966", "delta": "0:00:00.019243", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096197.86975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096197.86980: stdout chunk (state=3): >>><<< 13355 1727096197.86982: stderr chunk (state=3): >>><<< 13355 1727096197.86985: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "connection.id: bond0\nconnection.uuid: a55cc535-e38f-4547-bb0f-3479e284a0c7\nconnection.stable-id: --\nconnection.type: bond\nconnection.interface-name: nm-bond\nconnection.autoconnect: yes\nconnection.autoconnect-priority: 0\nconnection.autoconnect-retries: -1 (default)\nconnection.multi-connect: 0 (default)\nconnection.auth-retries: -1\nconnection.timestamp: 1727096188\nconnection.permissions: --\nconnection.zone: --\nconnection.controller: --\nconnection.master: --\nconnection.slave-type: --\nconnection.port-type: --\nconnection.autoconnect-slaves: -1 (default)\nconnection.autoconnect-ports: -1 (default)\nconnection.down-on-poweroff: -1 (default)\nconnection.secondaries: --\nconnection.gateway-ping-timeout: 0\nconnection.metered: unknown\nconnection.lldp: default\nconnection.mdns: -1 (default)\nconnection.llmnr: -1 (default)\nconnection.dns-over-tls: -1 
(default)\nconnection.mptcp-flags: 0x0 (default)\nconnection.wait-device-timeout: -1\nconnection.wait-activation-delay: -1\nipv4.method: auto\nipv4.dns: --\nipv4.dns-search: --\nipv4.dns-options: --\nipv4.dns-priority: 0\nipv4.addresses: --\nipv4.gateway: --\nipv4.routes: --\nipv4.route-metric: 65535\nipv4.route-table: 0 (unspec)\nipv4.routing-rules: --\nipv4.replace-local-rule: -1 (default)\nipv4.dhcp-send-release: -1 (default)\nipv4.ignore-auto-routes: no\nipv4.ignore-auto-dns: no\nipv4.dhcp-client-id: --\nipv4.dhcp-iaid: --\nipv4.dhcp-dscp: --\nipv4.dhcp-timeout: 0 (default)\nipv4.dhcp-send-hostname: yes\nipv4.dhcp-hostname: --\nipv4.dhcp-fqdn: --\nipv4.dhcp-hostname-flags: 0x0 (none)\nipv4.never-default: no\nipv4.may-fail: yes\nipv4.required-timeout: -1 (default)\nipv4.dad-timeout: -1 (default)\nipv4.dhcp-vendor-class-identifier: --\nipv4.link-local: 0 (default)\nipv4.dhcp-reject-servers: --\nipv4.auto-route-ext-gw: -1 (default)\nipv6.method: auto\nipv6.dns: --\nipv6.dns-search: --\nipv6.dns-options: --\nipv6.dns-priority: 0\nipv6.addresses: --\nipv6.gateway: --\nipv6.routes: --\nipv6.route-metric: -1\nipv6.route-table: 0 (unspec)\nipv6.routing-rules: --\nipv6.replace-local-rule: -1 (default)\nipv6.dhcp-send-release: -1 (default)\nipv6.ignore-auto-routes: no\nipv6.ignore-auto-dns: no\nipv6.never-default: no\nipv6.may-fail: yes\nipv6.required-timeout: -1 (default)\nipv6.ip6-privacy: -1 (default)\nipv6.temp-valid-lifetime: 0 (default)\nipv6.temp-preferred-lifetime: 0 (default)\nipv6.addr-gen-mode: default\nipv6.ra-timeout: 0 (default)\nipv6.mtu: auto\nipv6.dhcp-pd-hint: --\nipv6.dhcp-duid: --\nipv6.dhcp-iaid: --\nipv6.dhcp-timeout: 0 (default)\nipv6.dhcp-send-hostname: yes\nipv6.dhcp-hostname: --\nipv6.dhcp-hostname-flags: 0x0 (none)\nipv6.auto-route-ext-gw: -1 (default)\nipv6.token: --\nbond.options: mode=active-backup,miimon=110\nproxy.method: none\nproxy.browser-only: no\nproxy.pac-url: --\nproxy.pac-script: --\nGENERAL.NAME: bond0\nGENERAL.UUID: 
a55cc535-e38f-4547-bb0f-3479e284a0c7\nGENERAL.DEVICES: nm-bond\nGENERAL.IP-IFACE: nm-bond\nGENERAL.STATE: activated\nGENERAL.DEFAULT: no\nGENERAL.DEFAULT6: yes\nGENERAL.SPEC-OBJECT: --\nGENERAL.VPN: no\nGENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/28\nGENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/23\nGENERAL.ZONE: --\nGENERAL.MASTER-PATH: --\nIP4.ADDRESS[1]: 192.0.2.223/24\nIP4.GATEWAY: 192.0.2.1\nIP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535\nIP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535\nIP4.DNS[1]: 192.0.2.1\nDHCP4.OPTION[1]: broadcast_address = 192.0.2.255\nDHCP4.OPTION[2]: dhcp_client_identifier = 01:fa:71:61:85:c3:c6\nDHCP4.OPTION[3]: dhcp_lease_time = 240\nDHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1\nDHCP4.OPTION[5]: domain_name_servers = 192.0.2.1\nDHCP4.OPTION[6]: expiry = 1727096428\nDHCP4.OPTION[7]: host_name = ip-10-31-14-152\nDHCP4.OPTION[8]: ip_address = 192.0.2.223\nDHCP4.OPTION[9]: next_server = 192.0.2.1\nDHCP4.OPTION[10]: requested_broadcast_address = 1\nDHCP4.OPTION[11]: requested_domain_name = 1\nDHCP4.OPTION[12]: requested_domain_name_servers = 1\nDHCP4.OPTION[13]: requested_domain_search = 1\nDHCP4.OPTION[14]: requested_host_name = 1\nDHCP4.OPTION[15]: requested_interface_mtu = 1\nDHCP4.OPTION[16]: requested_ms_classless_static_routes = 1\nDHCP4.OPTION[17]: requested_nis_domain = 1\nDHCP4.OPTION[18]: requested_nis_servers = 1\nDHCP4.OPTION[19]: requested_ntp_servers = 1\nDHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1\nDHCP4.OPTION[21]: requested_root_path = 1\nDHCP4.OPTION[22]: requested_routers = 1\nDHCP4.OPTION[23]: requested_static_routes = 1\nDHCP4.OPTION[24]: requested_subnet_mask = 1\nDHCP4.OPTION[25]: requested_time_offset = 1\nDHCP4.OPTION[26]: requested_wpad = 1\nDHCP4.OPTION[27]: routers = 192.0.2.1\nDHCP4.OPTION[28]: subnet_mask = 255.255.255.0\nIP6.ADDRESS[1]: 2001:db8::f5/128\nIP6.ADDRESS[2]: 2001:db8::f871:61ff:fe85:c3c6/64\nIP6.ADDRESS[3]: 
fe80::f871:61ff:fe85:c3c6/64\nIP6.GATEWAY: fe80::8846:a3ff:fea6:9457\nIP6.ROUTE[1]: dst = 2001:db8::f5/128, nh = ::, mt = 300\nIP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300\nIP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024\nIP6.ROUTE[4]: dst = ::/0, nh = fe80::8846:a3ff:fea6:9457, mt = 300\nIP6.DNS[1]: 2001:db8::98be:d7ff:fe0d:a8a3\nIP6.DNS[2]: fe80::8846:a3ff:fea6:9457\nDHCP6.OPTION[1]: dhcp6_client_id = 00:04:f8:bb:81:64:92:68:84:8e:b8:30:57:ee:ce:2d:ea:a7\nDHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::98be:d7ff:fe0d:a8a3\nDHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-14-152\nDHCP6.OPTION[4]: iaid = 8c:3b:13:c0\nDHCP6.OPTION[5]: ip6_address = 2001:db8::f5", "stderr": "", "rc": 0, "cmd": ["nmcli", "c", "show", "--active", "bond0"], "start": "2024-09-23 08:56:37.828723", "end": "2024-09-23 08:56:37.847966", "delta": "0:00:00.019243", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli c show --active bond0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096197.87006: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli c show --active bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096197.87013: _low_level_execute_command(): starting 13355 1727096197.87019: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096197.5341198-15393-43902139786732/ > /dev/null 2>&1 && sleep 0' 13355 1727096197.87665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096197.87672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096197.87684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096197.87697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096197.87708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 
13355 1727096197.87715: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096197.87723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096197.87817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096197.87876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096197.87906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096197.89790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096197.90074: stderr chunk (state=3): >>><<< 13355 1727096197.90079: stdout chunk (state=3): >>><<< 13355 1727096197.90083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096197.90086: handler run complete 13355 1727096197.90089: Evaluated conditional (False): False 13355 1727096197.90091: attempt loop complete, returning result 13355 1727096197.90094: _execute() done 13355 1727096197.90096: dumping result to json 13355 1727096197.90099: done dumping result, returning 13355 1727096197.90101: done running TaskExecutor() for managed_node3/TASK: From the active connection, get the controller profile "bond0" [0afff68d-5257-c514-593f-000000000160] 13355 1727096197.90104: sending task result for task 0afff68d-5257-c514-593f-000000000160 13355 1727096197.90195: done sending task result for task 0afff68d-5257-c514-593f-000000000160 13355 1727096197.90198: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "nmcli", "c", "show", "--active", "bond0" ], "delta": "0:00:00.019243", "end": "2024-09-23 08:56:37.847966", "rc": 0, "start": "2024-09-23 08:56:37.828723" } STDOUT: connection.id: bond0 connection.uuid: a55cc535-e38f-4547-bb0f-3479e284a0c7 connection.stable-id: -- connection.type: bond connection.interface-name: nm-bond connection.autoconnect: yes connection.autoconnect-priority: 0 connection.autoconnect-retries: -1 (default) connection.multi-connect: 0 (default) connection.auth-retries: -1 connection.timestamp: 1727096188 connection.permissions: -- connection.zone: -- connection.controller: -- connection.master: -- connection.slave-type: -- 
connection.port-type: -- connection.autoconnect-slaves: -1 (default) connection.autoconnect-ports: -1 (default) connection.down-on-poweroff: -1 (default) connection.secondaries: -- connection.gateway-ping-timeout: 0 connection.metered: unknown connection.lldp: default connection.mdns: -1 (default) connection.llmnr: -1 (default) connection.dns-over-tls: -1 (default) connection.mptcp-flags: 0x0 (default) connection.wait-device-timeout: -1 connection.wait-activation-delay: -1 ipv4.method: auto ipv4.dns: -- ipv4.dns-search: -- ipv4.dns-options: -- ipv4.dns-priority: 0 ipv4.addresses: -- ipv4.gateway: -- ipv4.routes: -- ipv4.route-metric: 65535 ipv4.route-table: 0 (unspec) ipv4.routing-rules: -- ipv4.replace-local-rule: -1 (default) ipv4.dhcp-send-release: -1 (default) ipv4.ignore-auto-routes: no ipv4.ignore-auto-dns: no ipv4.dhcp-client-id: -- ipv4.dhcp-iaid: -- ipv4.dhcp-dscp: -- ipv4.dhcp-timeout: 0 (default) ipv4.dhcp-send-hostname: yes ipv4.dhcp-hostname: -- ipv4.dhcp-fqdn: -- ipv4.dhcp-hostname-flags: 0x0 (none) ipv4.never-default: no ipv4.may-fail: yes ipv4.required-timeout: -1 (default) ipv4.dad-timeout: -1 (default) ipv4.dhcp-vendor-class-identifier: -- ipv4.link-local: 0 (default) ipv4.dhcp-reject-servers: -- ipv4.auto-route-ext-gw: -1 (default) ipv6.method: auto ipv6.dns: -- ipv6.dns-search: -- ipv6.dns-options: -- ipv6.dns-priority: 0 ipv6.addresses: -- ipv6.gateway: -- ipv6.routes: -- ipv6.route-metric: -1 ipv6.route-table: 0 (unspec) ipv6.routing-rules: -- ipv6.replace-local-rule: -1 (default) ipv6.dhcp-send-release: -1 (default) ipv6.ignore-auto-routes: no ipv6.ignore-auto-dns: no ipv6.never-default: no ipv6.may-fail: yes ipv6.required-timeout: -1 (default) ipv6.ip6-privacy: -1 (default) ipv6.temp-valid-lifetime: 0 (default) ipv6.temp-preferred-lifetime: 0 (default) ipv6.addr-gen-mode: default ipv6.ra-timeout: 0 (default) ipv6.mtu: auto ipv6.dhcp-pd-hint: -- ipv6.dhcp-duid: -- ipv6.dhcp-iaid: -- ipv6.dhcp-timeout: 0 (default) ipv6.dhcp-send-hostname: yes 
ipv6.dhcp-hostname: -- ipv6.dhcp-hostname-flags: 0x0 (none) ipv6.auto-route-ext-gw: -1 (default) ipv6.token: -- bond.options: mode=active-backup,miimon=110 proxy.method: none proxy.browser-only: no proxy.pac-url: -- proxy.pac-script: -- GENERAL.NAME: bond0 GENERAL.UUID: a55cc535-e38f-4547-bb0f-3479e284a0c7 GENERAL.DEVICES: nm-bond GENERAL.IP-IFACE: nm-bond GENERAL.STATE: activated GENERAL.DEFAULT: no GENERAL.DEFAULT6: yes GENERAL.SPEC-OBJECT: -- GENERAL.VPN: no GENERAL.DBUS-PATH: /org/freedesktop/NetworkManager/ActiveConnection/28 GENERAL.CON-PATH: /org/freedesktop/NetworkManager/Settings/23 GENERAL.ZONE: -- GENERAL.MASTER-PATH: -- IP4.ADDRESS[1]: 192.0.2.223/24 IP4.GATEWAY: 192.0.2.1 IP4.ROUTE[1]: dst = 0.0.0.0/0, nh = 192.0.2.1, mt = 65535 IP4.ROUTE[2]: dst = 192.0.2.0/24, nh = 0.0.0.0, mt = 65535 IP4.DNS[1]: 192.0.2.1 DHCP4.OPTION[1]: broadcast_address = 192.0.2.255 DHCP4.OPTION[2]: dhcp_client_identifier = 01:fa:71:61:85:c3:c6 DHCP4.OPTION[3]: dhcp_lease_time = 240 DHCP4.OPTION[4]: dhcp_server_identifier = 192.0.2.1 DHCP4.OPTION[5]: domain_name_servers = 192.0.2.1 DHCP4.OPTION[6]: expiry = 1727096428 DHCP4.OPTION[7]: host_name = ip-10-31-14-152 DHCP4.OPTION[8]: ip_address = 192.0.2.223 DHCP4.OPTION[9]: next_server = 192.0.2.1 DHCP4.OPTION[10]: requested_broadcast_address = 1 DHCP4.OPTION[11]: requested_domain_name = 1 DHCP4.OPTION[12]: requested_domain_name_servers = 1 DHCP4.OPTION[13]: requested_domain_search = 1 DHCP4.OPTION[14]: requested_host_name = 1 DHCP4.OPTION[15]: requested_interface_mtu = 1 DHCP4.OPTION[16]: requested_ms_classless_static_routes = 1 DHCP4.OPTION[17]: requested_nis_domain = 1 DHCP4.OPTION[18]: requested_nis_servers = 1 DHCP4.OPTION[19]: requested_ntp_servers = 1 DHCP4.OPTION[20]: requested_rfc3442_classless_static_routes = 1 DHCP4.OPTION[21]: requested_root_path = 1 DHCP4.OPTION[22]: requested_routers = 1 DHCP4.OPTION[23]: requested_static_routes = 1 DHCP4.OPTION[24]: requested_subnet_mask = 1 DHCP4.OPTION[25]: requested_time_offset = 1 
DHCP4.OPTION[26]: requested_wpad = 1 DHCP4.OPTION[27]: routers = 192.0.2.1 DHCP4.OPTION[28]: subnet_mask = 255.255.255.0 IP6.ADDRESS[1]: 2001:db8::f5/128 IP6.ADDRESS[2]: 2001:db8::f871:61ff:fe85:c3c6/64 IP6.ADDRESS[3]: fe80::f871:61ff:fe85:c3c6/64 IP6.GATEWAY: fe80::8846:a3ff:fea6:9457 IP6.ROUTE[1]: dst = 2001:db8::f5/128, nh = ::, mt = 300 IP6.ROUTE[2]: dst = 2001:db8::/64, nh = ::, mt = 300 IP6.ROUTE[3]: dst = fe80::/64, nh = ::, mt = 1024 IP6.ROUTE[4]: dst = ::/0, nh = fe80::8846:a3ff:fea6:9457, mt = 300 IP6.DNS[1]: 2001:db8::98be:d7ff:fe0d:a8a3 IP6.DNS[2]: fe80::8846:a3ff:fea6:9457 DHCP6.OPTION[1]: dhcp6_client_id = 00:04:f8:bb:81:64:92:68:84:8e:b8:30:57:ee:ce:2d:ea:a7 DHCP6.OPTION[2]: dhcp6_name_servers = 2001:db8::98be:d7ff:fe0d:a8a3 DHCP6.OPTION[3]: fqdn_fqdn = ip-10-31-14-152 DHCP6.OPTION[4]: iaid = 8c:3b:13:c0 DHCP6.OPTION[5]: ip6_address = 2001:db8::f5 13355 1727096197.90356: no more pending results, returning what we have 13355 1727096197.90361: results queue empty 13355 1727096197.90362: checking for any_errors_fatal 13355 1727096197.90364: done checking for any_errors_fatal 13355 1727096197.90365: checking for max_fail_percentage 13355 1727096197.90369: done checking for max_fail_percentage 13355 1727096197.90370: checking to see if all hosts have failed and the running result is not ok 13355 1727096197.90371: done checking to see if all hosts have failed 13355 1727096197.90371: getting the remaining hosts for this loop 13355 1727096197.90373: done getting the remaining hosts for this loop 13355 1727096197.90377: getting the next task for host managed_node3 13355 1727096197.90384: done getting next task for host managed_node3 13355 1727096197.90387: ^ task is: TASK: Assert that the controller profile is activated 13355 1727096197.90389: ^ state is: HOST STATE: block=2, task=32, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13355 1727096197.90393: getting variables 13355 1727096197.90395: in VariableManager get_vars() 13355 1727096197.90452: Calling all_inventory to load vars for managed_node3 13355 1727096197.90455: Calling groups_inventory to load vars for managed_node3 13355 1727096197.90458: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096197.90574: Calling all_plugins_play to load vars for managed_node3 13355 1727096197.90583: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096197.90588: Calling groups_plugins_play to load vars for managed_node3 13355 1727096197.92271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096197.93911: done with get_vars() 13355 1727096197.93939: done getting variables 13355 1727096197.94005: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:207 Monday 23 September 2024 08:56:37 -0400 (0:00:00.467) 0:00:47.200 ****** 13355 1727096197.94033: entering _queue_task() for managed_node3/assert 13355 1727096197.94521: worker is 1 (out of 1 available) 13355 1727096197.94533: exiting _queue_task() for managed_node3/assert 13355 1727096197.94546: done queuing things up, now waiting for results queue to drain 13355 1727096197.94547: waiting for pending results... 
13355 1727096197.94749: running TaskExecutor() for managed_node3/TASK: Assert that the controller profile is activated 13355 1727096197.94851: in run() - task 0afff68d-5257-c514-593f-000000000161 13355 1727096197.94865: variable 'ansible_search_path' from source: unknown 13355 1727096197.94914: calling self._execute() 13355 1727096197.95028: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096197.95035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096197.95050: variable 'omit' from source: magic vars 13355 1727096197.95471: variable 'ansible_distribution_major_version' from source: facts 13355 1727096197.95488: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096197.95612: variable 'network_provider' from source: set_fact 13355 1727096197.95616: Evaluated conditional (network_provider == "nm"): True 13355 1727096197.95638: variable 'omit' from source: magic vars 13355 1727096197.95651: variable 'omit' from source: magic vars 13355 1727096197.95808: variable 'controller_profile' from source: play vars 13355 1727096197.95812: variable 'omit' from source: magic vars 13355 1727096197.95824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096197.95873: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096197.95886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096197.95904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096197.95921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096197.96115: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096197.96119: variable 
'ansible_host' from source: host vars for 'managed_node3' 13355 1727096197.96121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096197.96124: Set connection var ansible_shell_executable to /bin/sh 13355 1727096197.96127: Set connection var ansible_shell_type to sh 13355 1727096197.96129: Set connection var ansible_pipelining to False 13355 1727096197.96131: Set connection var ansible_connection to ssh 13355 1727096197.96133: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096197.96135: Set connection var ansible_timeout to 10 13355 1727096197.96138: variable 'ansible_shell_executable' from source: unknown 13355 1727096197.96142: variable 'ansible_connection' from source: unknown 13355 1727096197.96144: variable 'ansible_module_compression' from source: unknown 13355 1727096197.96146: variable 'ansible_shell_type' from source: unknown 13355 1727096197.96148: variable 'ansible_shell_executable' from source: unknown 13355 1727096197.96150: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096197.96152: variable 'ansible_pipelining' from source: unknown 13355 1727096197.96154: variable 'ansible_timeout' from source: unknown 13355 1727096197.96159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096197.96574: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096197.96578: variable 'omit' from source: magic vars 13355 1727096197.96581: starting attempt loop 13355 1727096197.96583: running the handler 13355 1727096197.96586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096197.98765: Loading FilterModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096197.98831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096197.98891: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096197.98926: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096197.98952: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096197.99040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096197.99064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096197.99104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096197.99144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096197.99162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096197.99274: variable 'active_controller_profile' from source: set_fact 13355 1727096197.99312: Evaluated conditional (active_controller_profile.stdout | length != 0): True 13355 1727096197.99319: handler run complete 13355 1727096197.99334: attempt 
loop complete, returning result 13355 1727096197.99337: _execute() done 13355 1727096197.99340: dumping result to json 13355 1727096197.99343: done dumping result, returning 13355 1727096197.99352: done running TaskExecutor() for managed_node3/TASK: Assert that the controller profile is activated [0afff68d-5257-c514-593f-000000000161] 13355 1727096197.99358: sending task result for task 0afff68d-5257-c514-593f-000000000161 13355 1727096197.99451: done sending task result for task 0afff68d-5257-c514-593f-000000000161 13355 1727096197.99455: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 13355 1727096197.99514: no more pending results, returning what we have 13355 1727096197.99518: results queue empty 13355 1727096197.99518: checking for any_errors_fatal 13355 1727096197.99531: done checking for any_errors_fatal 13355 1727096197.99532: checking for max_fail_percentage 13355 1727096197.99534: done checking for max_fail_percentage 13355 1727096197.99535: checking to see if all hosts have failed and the running result is not ok 13355 1727096197.99536: done checking to see if all hosts have failed 13355 1727096197.99537: getting the remaining hosts for this loop 13355 1727096197.99538: done getting the remaining hosts for this loop 13355 1727096197.99542: getting the next task for host managed_node3 13355 1727096197.99549: done getting next task for host managed_node3 13355 1727096197.99552: ^ task is: TASK: Get the controller device details 13355 1727096197.99554: ^ state is: HOST STATE: block=2, task=33, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13355 1727096197.99559: getting variables 13355 1727096197.99569: in VariableManager get_vars() 13355 1727096197.99632: Calling all_inventory to load vars for managed_node3 13355 1727096197.99635: Calling groups_inventory to load vars for managed_node3 13355 1727096197.99638: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096197.99650: Calling all_plugins_play to load vars for managed_node3 13355 1727096197.99654: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096197.99657: Calling groups_plugins_play to load vars for managed_node3 13355 1727096198.01364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096198.03123: done with get_vars() 13355 1727096198.03156: done getting variables 13355 1727096198.03214: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the controller device details] *************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:214 Monday 23 September 2024 08:56:38 -0400 (0:00:00.092) 0:00:47.293 ****** 13355 1727096198.03241: entering _queue_task() for managed_node3/command 13355 1727096198.03726: worker is 1 (out of 1 available) 13355 1727096198.03738: exiting _queue_task() for managed_node3/command 13355 1727096198.03749: done queuing things up, now waiting for results queue to drain 13355 1727096198.03751: waiting for pending results... 
13355 1727096198.04087: running TaskExecutor() for managed_node3/TASK: Get the controller device details 13355 1727096198.04093: in run() - task 0afff68d-5257-c514-593f-000000000162 13355 1727096198.04098: variable 'ansible_search_path' from source: unknown 13355 1727096198.04120: calling self._execute() 13355 1727096198.04238: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.04244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096198.04254: variable 'omit' from source: magic vars 13355 1727096198.04667: variable 'ansible_distribution_major_version' from source: facts 13355 1727096198.04683: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096198.04802: variable 'network_provider' from source: set_fact 13355 1727096198.04976: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096198.04979: when evaluation is False, skipping this task 13355 1727096198.04981: _execute() done 13355 1727096198.04983: dumping result to json 13355 1727096198.04985: done dumping result, returning 13355 1727096198.04987: done running TaskExecutor() for managed_node3/TASK: Get the controller device details [0afff68d-5257-c514-593f-000000000162] 13355 1727096198.04989: sending task result for task 0afff68d-5257-c514-593f-000000000162 13355 1727096198.05058: done sending task result for task 0afff68d-5257-c514-593f-000000000162 13355 1727096198.05062: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096198.05262: no more pending results, returning what we have 13355 1727096198.05265: results queue empty 13355 1727096198.05266: checking for any_errors_fatal 13355 1727096198.05274: done checking for any_errors_fatal 13355 1727096198.05275: checking for max_fail_percentage 13355 1727096198.05276: done checking for 
max_fail_percentage 13355 1727096198.05277: checking to see if all hosts have failed and the running result is not ok 13355 1727096198.05278: done checking to see if all hosts have failed 13355 1727096198.05279: getting the remaining hosts for this loop 13355 1727096198.05280: done getting the remaining hosts for this loop 13355 1727096198.05284: getting the next task for host managed_node3 13355 1727096198.05289: done getting next task for host managed_node3 13355 1727096198.05292: ^ task is: TASK: Assert that the controller profile is activated 13355 1727096198.05294: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13355 1727096198.05298: getting variables 13355 1727096198.05299: in VariableManager get_vars() 13355 1727096198.05354: Calling all_inventory to load vars for managed_node3 13355 1727096198.05357: Calling groups_inventory to load vars for managed_node3 13355 1727096198.05359: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096198.05371: Calling all_plugins_play to load vars for managed_node3 13355 1727096198.05374: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096198.05378: Calling groups_plugins_play to load vars for managed_node3 13355 1727096198.06687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096198.08302: done with get_vars() 13355 1727096198.08331: done getting variables 13355 1727096198.08400: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=True) TASK [Assert that the controller profile is activated] ************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:221 Monday 23 September 2024 08:56:38 -0400 (0:00:00.051) 0:00:47.344 ****** 13355 1727096198.08430: entering _queue_task() for managed_node3/assert 13355 1727096198.08907: worker is 1 (out of 1 available) 13355 1727096198.08920: exiting _queue_task() for managed_node3/assert 13355 1727096198.08931: done queuing things up, now waiting for results queue to drain 13355 1727096198.08933: waiting for pending results... 13355 1727096198.09163: running TaskExecutor() for managed_node3/TASK: Assert that the controller profile is activated 13355 1727096198.09285: in run() - task 0afff68d-5257-c514-593f-000000000163 13355 1727096198.09290: variable 'ansible_search_path' from source: unknown 13355 1727096198.09300: calling self._execute() 13355 1727096198.09410: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.09414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096198.09423: variable 'omit' from source: magic vars 13355 1727096198.09825: variable 'ansible_distribution_major_version' from source: facts 13355 1727096198.09829: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096198.09936: variable 'network_provider' from source: set_fact 13355 1727096198.10043: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096198.10046: when evaluation is False, skipping this task 13355 1727096198.10048: _execute() done 13355 1727096198.10049: dumping result to json 13355 1727096198.10052: done dumping result, returning 13355 1727096198.10054: done running TaskExecutor() for managed_node3/TASK: Assert that the controller profile is activated [0afff68d-5257-c514-593f-000000000163] 13355 1727096198.10055: sending task result for task 
0afff68d-5257-c514-593f-000000000163 13355 1727096198.10126: done sending task result for task 0afff68d-5257-c514-593f-000000000163 13355 1727096198.10130: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096198.10184: no more pending results, returning what we have 13355 1727096198.10188: results queue empty 13355 1727096198.10189: checking for any_errors_fatal 13355 1727096198.10202: done checking for any_errors_fatal 13355 1727096198.10203: checking for max_fail_percentage 13355 1727096198.10204: done checking for max_fail_percentage 13355 1727096198.10205: checking to see if all hosts have failed and the running result is not ok 13355 1727096198.10206: done checking to see if all hosts have failed 13355 1727096198.10207: getting the remaining hosts for this loop 13355 1727096198.10208: done getting the remaining hosts for this loop 13355 1727096198.10212: getting the next task for host managed_node3 13355 1727096198.10226: done getting next task for host managed_node3 13355 1727096198.10233: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096198.10238: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096198.10266: getting variables 13355 1727096198.10270: in VariableManager get_vars() 13355 1727096198.10503: Calling all_inventory to load vars for managed_node3 13355 1727096198.10506: Calling groups_inventory to load vars for managed_node3 13355 1727096198.10508: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096198.10517: Calling all_plugins_play to load vars for managed_node3 13355 1727096198.10521: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096198.10524: Calling groups_plugins_play to load vars for managed_node3 13355 1727096198.12091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096198.13699: done with get_vars() 13355 1727096198.13737: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Monday 23 September 2024 08:56:38 -0400 (0:00:00.054) 0:00:47.399 ****** 13355 1727096198.13857: entering _queue_task() for managed_node3/include_tasks 13355 1727096198.14249: worker is 1 (out of 1 available) 13355 1727096198.14262: exiting _queue_task() for managed_node3/include_tasks 13355 1727096198.14480: done queuing things up, now waiting for results queue to drain 13355 1727096198.14482: waiting for pending results... 
13355 1727096198.14597: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13355 1727096198.14802: in run() - task 0afff68d-5257-c514-593f-00000000016c 13355 1727096198.14807: variable 'ansible_search_path' from source: unknown 13355 1727096198.14809: variable 'ansible_search_path' from source: unknown 13355 1727096198.14826: calling self._execute() 13355 1727096198.14924: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.15020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096198.15024: variable 'omit' from source: magic vars 13355 1727096198.15366: variable 'ansible_distribution_major_version' from source: facts 13355 1727096198.15386: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096198.15397: _execute() done 13355 1727096198.15454: dumping result to json 13355 1727096198.15457: done dumping result, returning 13355 1727096198.15466: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0afff68d-5257-c514-593f-00000000016c] 13355 1727096198.15471: sending task result for task 0afff68d-5257-c514-593f-00000000016c 13355 1727096198.15616: no more pending results, returning what we have 13355 1727096198.15621: in VariableManager get_vars() 13355 1727096198.15886: Calling all_inventory to load vars for managed_node3 13355 1727096198.15889: Calling groups_inventory to load vars for managed_node3 13355 1727096198.15891: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096198.15901: Calling all_plugins_play to load vars for managed_node3 13355 1727096198.15905: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096198.15908: Calling groups_plugins_play to load vars for managed_node3 13355 1727096198.16483: done sending task result for task 0afff68d-5257-c514-593f-00000000016c 13355 
1727096198.16487: WORKER PROCESS EXITING 13355 1727096198.17321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096198.18899: done with get_vars() 13355 1727096198.18926: variable 'ansible_search_path' from source: unknown 13355 1727096198.18928: variable 'ansible_search_path' from source: unknown 13355 1727096198.18971: we have included files to process 13355 1727096198.18973: generating all_blocks data 13355 1727096198.18975: done generating all_blocks data 13355 1727096198.18980: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096198.18981: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096198.18984: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13355 1727096198.19579: done processing included file 13355 1727096198.19581: iterating over new_blocks loaded from include file 13355 1727096198.19583: in VariableManager get_vars() 13355 1727096198.19618: done with get_vars() 13355 1727096198.19619: filtering new block on tags 13355 1727096198.19649: done filtering new block on tags 13355 1727096198.19653: in VariableManager get_vars() 13355 1727096198.19693: done with get_vars() 13355 1727096198.19695: filtering new block on tags 13355 1727096198.19735: done filtering new block on tags 13355 1727096198.19737: in VariableManager get_vars() 13355 1727096198.19782: done with get_vars() 13355 1727096198.19784: filtering new block on tags 13355 1727096198.19821: done filtering new block on tags 13355 1727096198.19824: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 13355 1727096198.19830: extending task lists for all hosts 
with included blocks 13355 1727096198.22104: done extending task lists 13355 1727096198.22106: done processing included files 13355 1727096198.22107: results queue empty 13355 1727096198.22108: checking for any_errors_fatal 13355 1727096198.22112: done checking for any_errors_fatal 13355 1727096198.22112: checking for max_fail_percentage 13355 1727096198.22113: done checking for max_fail_percentage 13355 1727096198.22114: checking to see if all hosts have failed and the running result is not ok 13355 1727096198.22115: done checking to see if all hosts have failed 13355 1727096198.22116: getting the remaining hosts for this loop 13355 1727096198.22117: done getting the remaining hosts for this loop 13355 1727096198.22119: getting the next task for host managed_node3 13355 1727096198.22124: done getting next task for host managed_node3 13355 1727096198.22127: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096198.22130: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096198.22141: getting variables 13355 1727096198.22142: in VariableManager get_vars() 13355 1727096198.22313: Calling all_inventory to load vars for managed_node3 13355 1727096198.22316: Calling groups_inventory to load vars for managed_node3 13355 1727096198.22318: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096198.22325: Calling all_plugins_play to load vars for managed_node3 13355 1727096198.22327: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096198.22330: Calling groups_plugins_play to load vars for managed_node3 13355 1727096198.24207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096198.27527: done with get_vars() 13355 1727096198.27562: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Monday 23 September 2024 08:56:38 -0400 (0:00:00.137) 0:00:47.537 ****** 13355 1727096198.27657: entering _queue_task() for managed_node3/setup 13355 1727096198.28453: worker is 1 (out of 1 available) 13355 1727096198.28870: exiting _queue_task() for managed_node3/setup 13355 1727096198.28884: done queuing things up, now waiting for results queue to drain 13355 1727096198.28886: waiting for pending results... 
13355 1727096198.29099: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13355 1727096198.29443: in run() - task 0afff68d-5257-c514-593f-000000000914 13355 1727096198.29509: variable 'ansible_search_path' from source: unknown 13355 1727096198.29517: variable 'ansible_search_path' from source: unknown 13355 1727096198.29558: calling self._execute() 13355 1727096198.29852: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.29865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096198.29880: variable 'omit' from source: magic vars 13355 1727096198.30627: variable 'ansible_distribution_major_version' from source: facts 13355 1727096198.30648: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096198.31254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096198.35395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096198.35592: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096198.35679: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096198.35718: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096198.35773: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096198.35864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096198.35915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096198.35945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096198.35997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096198.36018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096198.36083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096198.36111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096198.36136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096198.36182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096198.36199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096198.36416: variable '__network_required_facts' from source: role 
'' defaults 13355 1727096198.36509: variable 'ansible_facts' from source: unknown 13355 1727096198.37492: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 13355 1727096198.37502: when evaluation is False, skipping this task 13355 1727096198.37508: _execute() done 13355 1727096198.37513: dumping result to json 13355 1727096198.37519: done dumping result, returning 13355 1727096198.37531: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0afff68d-5257-c514-593f-000000000914] 13355 1727096198.37539: sending task result for task 0afff68d-5257-c514-593f-000000000914 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096198.37819: no more pending results, returning what we have 13355 1727096198.37823: results queue empty 13355 1727096198.37824: checking for any_errors_fatal 13355 1727096198.37826: done checking for any_errors_fatal 13355 1727096198.37826: checking for max_fail_percentage 13355 1727096198.37828: done checking for max_fail_percentage 13355 1727096198.37829: checking to see if all hosts have failed and the running result is not ok 13355 1727096198.37830: done checking to see if all hosts have failed 13355 1727096198.37831: getting the remaining hosts for this loop 13355 1727096198.37833: done getting the remaining hosts for this loop 13355 1727096198.37837: getting the next task for host managed_node3 13355 1727096198.37848: done getting next task for host managed_node3 13355 1727096198.37853: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096198.37858: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096198.37883: getting variables 13355 1727096198.37885: in VariableManager get_vars() 13355 1727096198.37940: Calling all_inventory to load vars for managed_node3 13355 1727096198.37943: Calling groups_inventory to load vars for managed_node3 13355 1727096198.37945: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096198.37955: Calling all_plugins_play to load vars for managed_node3 13355 1727096198.37958: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096198.37961: Calling groups_plugins_play to load vars for managed_node3 13355 1727096198.38083: done sending task result for task 0afff68d-5257-c514-593f-000000000914 13355 1727096198.38087: WORKER PROCESS EXITING 13355 1727096198.40301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096198.43154: done with get_vars() 13355 1727096198.43182: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Monday 23 September 2024 08:56:38 -0400 (0:00:00.156) 0:00:47.693 ****** 13355 1727096198.43293: entering _queue_task() for managed_node3/stat 13355 1727096198.43734: worker is 1 (out of 1 available) 13355 1727096198.43748: exiting _queue_task() for managed_node3/stat 13355 1727096198.43762: done queuing things up, now waiting for results queue to drain 13355 1727096198.43764: waiting for pending results... 13355 1727096198.44076: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 13355 1727096198.44256: in run() - task 0afff68d-5257-c514-593f-000000000916 13355 1727096198.44280: variable 'ansible_search_path' from source: unknown 13355 1727096198.44288: variable 'ansible_search_path' from source: unknown 13355 1727096198.44333: calling self._execute() 13355 1727096198.44522: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.44525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096198.44528: variable 'omit' from source: magic vars 13355 1727096198.45175: variable 'ansible_distribution_major_version' from source: facts 13355 1727096198.45179: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096198.45298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096198.45564: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096198.45617: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096198.45651: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096198.45687: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
13355 1727096198.45776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096198.45807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096198.45843: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096198.45875: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096198.45976: variable '__network_is_ostree' from source: set_fact 13355 1727096198.45988: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096198.46048: when evaluation is False, skipping this task 13355 1727096198.46056: _execute() done 13355 1727096198.46064: dumping result to json 13355 1727096198.46074: done dumping result, returning 13355 1727096198.46086: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0afff68d-5257-c514-593f-000000000916] 13355 1727096198.46095: sending task result for task 0afff68d-5257-c514-593f-000000000916 13355 1727096198.46428: done sending task result for task 0afff68d-5257-c514-593f-000000000916 13355 1727096198.46431: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096198.46489: no more pending results, returning what we have 13355 1727096198.46493: results queue empty 13355 1727096198.46494: checking for any_errors_fatal 13355 1727096198.46501: 
done checking for any_errors_fatal 13355 1727096198.46502: checking for max_fail_percentage 13355 1727096198.46504: done checking for max_fail_percentage 13355 1727096198.46504: checking to see if all hosts have failed and the running result is not ok 13355 1727096198.46505: done checking to see if all hosts have failed 13355 1727096198.46506: getting the remaining hosts for this loop 13355 1727096198.46508: done getting the remaining hosts for this loop 13355 1727096198.46512: getting the next task for host managed_node3 13355 1727096198.46521: done getting next task for host managed_node3 13355 1727096198.46525: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096198.46531: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096198.46557: getting variables 13355 1727096198.46559: in VariableManager get_vars() 13355 1727096198.46622: Calling all_inventory to load vars for managed_node3 13355 1727096198.46625: Calling groups_inventory to load vars for managed_node3 13355 1727096198.46628: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096198.46640: Calling all_plugins_play to load vars for managed_node3 13355 1727096198.46643: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096198.46646: Calling groups_plugins_play to load vars for managed_node3 13355 1727096198.48560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096198.50127: done with get_vars() 13355 1727096198.50161: done getting variables 13355 1727096198.50226: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Monday 23 September 2024 08:56:38 -0400 (0:00:00.069) 0:00:47.763 ****** 13355 1727096198.50266: entering _queue_task() for managed_node3/set_fact 13355 1727096198.50627: worker is 1 (out of 1 available) 13355 1727096198.50640: exiting _queue_task() for managed_node3/set_fact 13355 1727096198.50652: done queuing things up, now waiting for results queue to drain 13355 1727096198.50653: waiting for pending results... 
13355 1727096198.50941: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13355 1727096198.51119: in run() - task 0afff68d-5257-c514-593f-000000000917 13355 1727096198.51140: variable 'ansible_search_path' from source: unknown 13355 1727096198.51148: variable 'ansible_search_path' from source: unknown 13355 1727096198.51190: calling self._execute() 13355 1727096198.51290: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.51307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096198.51472: variable 'omit' from source: magic vars 13355 1727096198.51686: variable 'ansible_distribution_major_version' from source: facts 13355 1727096198.51720: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096198.51882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096198.52188: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096198.52262: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096198.52304: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096198.52341: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096198.52434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096198.52466: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096198.52501: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096198.52582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096198.52627: variable '__network_is_ostree' from source: set_fact 13355 1727096198.52642: Evaluated conditional (not __network_is_ostree is defined): False 13355 1727096198.52649: when evaluation is False, skipping this task 13355 1727096198.52655: _execute() done 13355 1727096198.52662: dumping result to json 13355 1727096198.52670: done dumping result, returning 13355 1727096198.52686: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0afff68d-5257-c514-593f-000000000917] 13355 1727096198.52699: sending task result for task 0afff68d-5257-c514-593f-000000000917 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13355 1727096198.52845: no more pending results, returning what we have 13355 1727096198.52849: results queue empty 13355 1727096198.52850: checking for any_errors_fatal 13355 1727096198.52859: done checking for any_errors_fatal 13355 1727096198.52860: checking for max_fail_percentage 13355 1727096198.52862: done checking for max_fail_percentage 13355 1727096198.52863: checking to see if all hosts have failed and the running result is not ok 13355 1727096198.52863: done checking to see if all hosts have failed 13355 1727096198.52864: getting the remaining hosts for this loop 13355 1727096198.52866: done getting the remaining hosts for this loop 13355 1727096198.52871: getting the next task for host managed_node3 13355 1727096198.52882: done getting next task for host managed_node3 13355 
1727096198.52886: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096198.52892: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096198.52916: getting variables 13355 1727096198.52917: in VariableManager get_vars() 13355 1727096198.53076: Calling all_inventory to load vars for managed_node3 13355 1727096198.53079: Calling groups_inventory to load vars for managed_node3 13355 1727096198.53082: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096198.53093: Calling all_plugins_play to load vars for managed_node3 13355 1727096198.53096: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096198.53099: Calling groups_plugins_play to load vars for managed_node3 13355 1727096198.53833: done sending task result for task 0afff68d-5257-c514-593f-000000000917 13355 1727096198.53837: WORKER PROCESS EXITING 13355 1727096198.56298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096198.57990: done with get_vars() 13355 1727096198.58013: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Monday 23 September 2024 08:56:38 -0400 (0:00:00.078) 0:00:47.841 ****** 13355 1727096198.58107: entering _queue_task() for managed_node3/service_facts 13355 1727096198.58365: worker is 1 (out of 1 available) 13355 1727096198.58379: exiting _queue_task() for managed_node3/service_facts 13355 1727096198.58391: done queuing things up, now waiting for results queue to drain 13355 1727096198.58392: waiting for pending results... 
13355 1727096198.58573: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 13355 1727096198.58690: in run() - task 0afff68d-5257-c514-593f-000000000919 13355 1727096198.58701: variable 'ansible_search_path' from source: unknown 13355 1727096198.58704: variable 'ansible_search_path' from source: unknown 13355 1727096198.58734: calling self._execute() 13355 1727096198.58809: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.58813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096198.58821: variable 'omit' from source: magic vars 13355 1727096198.59118: variable 'ansible_distribution_major_version' from source: facts 13355 1727096198.59128: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096198.59133: variable 'omit' from source: magic vars 13355 1727096198.59190: variable 'omit' from source: magic vars 13355 1727096198.59215: variable 'omit' from source: magic vars 13355 1727096198.59247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096198.59474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096198.59478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096198.59481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096198.59484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096198.59486: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096198.59488: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.59490: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13355 1727096198.59493: Set connection var ansible_shell_executable to /bin/sh 13355 1727096198.59495: Set connection var ansible_shell_type to sh 13355 1727096198.59497: Set connection var ansible_pipelining to False 13355 1727096198.59499: Set connection var ansible_connection to ssh 13355 1727096198.59501: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096198.59503: Set connection var ansible_timeout to 10 13355 1727096198.59505: variable 'ansible_shell_executable' from source: unknown 13355 1727096198.59508: variable 'ansible_connection' from source: unknown 13355 1727096198.59510: variable 'ansible_module_compression' from source: unknown 13355 1727096198.59512: variable 'ansible_shell_type' from source: unknown 13355 1727096198.59514: variable 'ansible_shell_executable' from source: unknown 13355 1727096198.59516: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096198.59518: variable 'ansible_pipelining' from source: unknown 13355 1727096198.59519: variable 'ansible_timeout' from source: unknown 13355 1727096198.59521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096198.59678: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096198.59690: variable 'omit' from source: magic vars 13355 1727096198.59693: starting attempt loop 13355 1727096198.59696: running the handler 13355 1727096198.59706: _low_level_execute_command(): starting 13355 1727096198.59714: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096198.60220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 13355 1727096198.60224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.60228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096198.60233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.60281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096198.60287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096198.60338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096198.62006: stdout chunk (state=3): >>>/root <<< 13355 1727096198.62106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096198.62132: stderr chunk (state=3): >>><<< 13355 1727096198.62136: stdout chunk (state=3): >>><<< 13355 1727096198.62156: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096198.62174: _low_level_execute_command(): starting 13355 1727096198.62180: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708 `" && echo ansible-tmp-1727096198.6215951-15482-39236332515708="` echo /root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708 `" ) && sleep 0' 13355 1727096198.62625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096198.62628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.62631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096198.62642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096198.62645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.62687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096198.62690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096198.62697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096198.62733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096198.64651: stdout chunk (state=3): >>>ansible-tmp-1727096198.6215951-15482-39236332515708=/root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708 <<< 13355 1727096198.64773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096198.64783: stderr chunk (state=3): >>><<< 13355 1727096198.64786: stdout chunk (state=3): >>><<< 13355 1727096198.64802: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096198.6215951-15482-39236332515708=/root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096198.64843: variable 'ansible_module_compression' from source: unknown 13355 1727096198.64883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13355 1727096198.64918: variable 'ansible_facts' from source: unknown 13355 1727096198.64971: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/AnsiballZ_service_facts.py 13355 1727096198.65085: Sending initial data 13355 1727096198.65088: Sent initial data (161 bytes) 13355 1727096198.65551: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096198.65554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.65556: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 13355 1727096198.65559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096198.65561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.65622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096198.65625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096198.65654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096198.67273: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096198.67331: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096198.67335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp6r74q1ms /root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/AnsiballZ_service_facts.py <<< 13355 1727096198.67357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/AnsiballZ_service_facts.py" <<< 13355 1727096198.67400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp6r74q1ms" to remote "/root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/AnsiballZ_service_facts.py" <<< 13355 1727096198.68279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096198.68283: stdout chunk (state=3): >>><<< 13355 1727096198.68285: stderr chunk (state=3): >>><<< 13355 1727096198.68287: done transferring module to remote 13355 1727096198.68288: _low_level_execute_command(): starting 13355 1727096198.68290: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/ /root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/AnsiballZ_service_facts.py && sleep 0' 13355 1727096198.68715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096198.68719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096198.68749: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.68752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096198.68754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096198.68763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.68811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096198.68814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096198.68821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096198.68852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096198.70696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096198.70721: stderr chunk (state=3): >>><<< 13355 1727096198.70724: stdout chunk (state=3): >>><<< 13355 1727096198.70738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096198.70741: _low_level_execute_command(): starting 13355 1727096198.70747: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/AnsiballZ_service_facts.py && sleep 0' 13355 1727096198.71208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096198.71212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.71214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096198.71216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096198.71270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096198.71274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096198.71277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096198.71322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096200.36761: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 13355 1727096200.36780: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": 
{"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13355 1727096200.38394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096200.38399: stdout chunk (state=3): >>><<< 13355 1727096200.38406: stderr chunk (state=3): >>><<< 13355 1727096200.38438: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": 
"systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": 
"autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": 
"sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096200.39966: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096200.39981: _low_level_execute_command(): starting 13355 1727096200.40034: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096198.6215951-15482-39236332515708/ > /dev/null 2>&1 && sleep 0' 13355 1727096200.41166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096200.41564: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096200.41570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096200.41572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096200.41575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096200.41577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096200.43519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096200.43673: stderr chunk (state=3): >>><<< 13355 1727096200.43677: stdout chunk (state=3): >>><<< 13355 1727096200.43680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096200.43682: handler run complete 13355 
1727096200.44012: variable 'ansible_facts' from source: unknown 13355 1727096200.44269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096200.45549: variable 'ansible_facts' from source: unknown 13355 1727096200.46013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096200.46644: attempt loop complete, returning result 13355 1727096200.46673: _execute() done 13355 1727096200.46676: dumping result to json 13355 1727096200.46782: done dumping result, returning 13355 1727096200.46785: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [0afff68d-5257-c514-593f-000000000919] 13355 1727096200.46788: sending task result for task 0afff68d-5257-c514-593f-000000000919 13355 1727096200.48882: done sending task result for task 0afff68d-5257-c514-593f-000000000919 13355 1727096200.48973: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096200.49072: no more pending results, returning what we have 13355 1727096200.49075: results queue empty 13355 1727096200.49076: checking for any_errors_fatal 13355 1727096200.49082: done checking for any_errors_fatal 13355 1727096200.49082: checking for max_fail_percentage 13355 1727096200.49084: done checking for max_fail_percentage 13355 1727096200.49085: checking to see if all hosts have failed and the running result is not ok 13355 1727096200.49086: done checking to see if all hosts have failed 13355 1727096200.49086: getting the remaining hosts for this loop 13355 1727096200.49088: done getting the remaining hosts for this loop 13355 1727096200.49091: getting the next task for host managed_node3 13355 1727096200.49098: done getting next task for host managed_node3 13355 1727096200.49102: ^ task is: 
TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096200.49107: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096200.49119: getting variables 13355 1727096200.49121: in VariableManager get_vars() 13355 1727096200.49167: Calling all_inventory to load vars for managed_node3 13355 1727096200.49373: Calling groups_inventory to load vars for managed_node3 13355 1727096200.49376: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096200.49385: Calling all_plugins_play to load vars for managed_node3 13355 1727096200.49388: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096200.49391: Calling groups_plugins_play to load vars for managed_node3 13355 1727096200.52778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096200.56303: done with get_vars() 13355 1727096200.56389: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Monday 23 September 2024 08:56:40 -0400 (0:00:01.985) 0:00:49.826 ****** 13355 1727096200.56619: entering _queue_task() for managed_node3/package_facts 13355 1727096200.57553: worker is 1 (out of 1 available) 13355 1727096200.57569: exiting _queue_task() for managed_node3/package_facts 13355 1727096200.57580: done queuing things up, now waiting for results queue to drain 13355 1727096200.57581: waiting for pending results... 
13355 1727096200.58186: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 13355 1727096200.58334: in run() - task 0afff68d-5257-c514-593f-00000000091a 13355 1727096200.58359: variable 'ansible_search_path' from source: unknown 13355 1727096200.58396: variable 'ansible_search_path' from source: unknown 13355 1727096200.58507: calling self._execute() 13355 1727096200.58677: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096200.58726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096200.58935: variable 'omit' from source: magic vars 13355 1727096200.59602: variable 'ansible_distribution_major_version' from source: facts 13355 1727096200.59720: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096200.59724: variable 'omit' from source: magic vars 13355 1727096200.59859: variable 'omit' from source: magic vars 13355 1727096200.59981: variable 'omit' from source: magic vars 13355 1727096200.60177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096200.60192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096200.60215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096200.60239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096200.60264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096200.60492: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096200.60495: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096200.60497: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 13355 1727096200.60627: Set connection var ansible_shell_executable to /bin/sh 13355 1727096200.60641: Set connection var ansible_shell_type to sh 13355 1727096200.60718: Set connection var ansible_pipelining to False 13355 1727096200.60728: Set connection var ansible_connection to ssh 13355 1727096200.60737: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096200.60746: Set connection var ansible_timeout to 10 13355 1727096200.60779: variable 'ansible_shell_executable' from source: unknown 13355 1727096200.60788: variable 'ansible_connection' from source: unknown 13355 1727096200.60825: variable 'ansible_module_compression' from source: unknown 13355 1727096200.60833: variable 'ansible_shell_type' from source: unknown 13355 1727096200.60840: variable 'ansible_shell_executable' from source: unknown 13355 1727096200.60846: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096200.60852: variable 'ansible_pipelining' from source: unknown 13355 1727096200.60972: variable 'ansible_timeout' from source: unknown 13355 1727096200.60975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096200.61311: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096200.61383: variable 'omit' from source: magic vars 13355 1727096200.61393: starting attempt loop 13355 1727096200.61399: running the handler 13355 1727096200.61417: _low_level_execute_command(): starting 13355 1727096200.61586: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096200.62946: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096200.63047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096200.63162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096200.63203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096200.63235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096200.65042: stdout chunk (state=3): >>>/root <<< 13355 1727096200.65279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096200.65293: stdout chunk (state=3): >>><<< 13355 1727096200.65305: stderr chunk (state=3): >>><<< 13355 1727096200.65328: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096200.65692: _low_level_execute_command(): starting 13355 1727096200.65696: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928 `" && echo ansible-tmp-1727096200.6547844-15532-55360797873928="` echo /root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928 `" ) && sleep 0' 13355 1727096200.66722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096200.66740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096200.66902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096200.66915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096200.66973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096200.69097: stdout chunk (state=3): >>>ansible-tmp-1727096200.6547844-15532-55360797873928=/root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928 <<< 13355 1727096200.69118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096200.69155: stderr chunk (state=3): >>><<< 13355 1727096200.69222: stdout chunk (state=3): >>><<< 13355 1727096200.69248: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096200.6547844-15532-55360797873928=/root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096200.69307: variable 'ansible_module_compression' from source: unknown 13355 1727096200.69574: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13355 1727096200.69578: variable 'ansible_facts' from source: unknown 13355 1727096200.70123: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/AnsiballZ_package_facts.py 13355 1727096200.70307: Sending initial data 13355 1727096200.70415: Sent initial data (161 bytes) 13355 1727096200.71788: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096200.71986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096200.72014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096200.73716: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096200.73790: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/AnsiballZ_package_facts.py" <<< 13355 1727096200.73804: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp1mw4xpdn /root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/AnsiballZ_package_facts.py <<< 13355 1727096200.73818: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp1mw4xpdn" to remote "/root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/AnsiballZ_package_facts.py" <<< 13355 1727096200.76869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096200.76874: stderr chunk (state=3): >>><<< 13355 1727096200.76876: stdout chunk (state=3): >>><<< 13355 1727096200.76879: done transferring module to remote 13355 1727096200.77172: _low_level_execute_command(): starting 13355 1727096200.77176: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/ /root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/AnsiballZ_package_facts.py && sleep 0' 13355 1727096200.78200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096200.78204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096200.78443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096200.78482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096200.78501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096200.78552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096200.80509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096200.80537: stderr chunk (state=3): >>><<< 13355 1727096200.80569: stdout chunk (state=3): >>><<< 13355 1727096200.80684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096200.80700: _low_level_execute_command(): starting 13355 1727096200.80711: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/AnsiballZ_package_facts.py && sleep 0' 13355 1727096200.81873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096200.81922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096200.81935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096200.82044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096200.82058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096200.82138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
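The `AnsiballZ_package_facts.py` payload executed above emits a single JSON document on stdout (visible in the following stdout chunks): `ansible_facts.packages` maps each package name to a list of installed instances, one per version/arch, each with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys. A minimal sketch of consuming that structure, using two entries copied from this log (the sample dict and the `nvr` helper are illustrative, not part of Ansible's API):

```python
import json

# Miniature stand-in for the package_facts stdout seen in this log.
sample = json.loads("""
{"ansible_facts": {"packages": {
  "bash":  [{"name": "bash",  "version": "5.2.26", "release": "4.el10",
             "epoch": null, "arch": "x86_64", "source": "rpm"}],
  "glibc": [{"name": "glibc", "version": "2.39",   "release": "17.el10",
             "epoch": null, "arch": "x86_64", "source": "rpm"}]
}}}
""")

packages = sample["ansible_facts"]["packages"]

def nvr(pkg):
    """Render a name-version-release string in rpm's usual form."""
    return f"{pkg['name']}-{pkg['version']}-{pkg['release']}"

# Flatten to {name: [nvr, ...]}; a name can carry several instances
# (multilib or multiple installed versions), hence the inner list.
installed = {name: [nvr(p) for p in instances]
             for name, instances in packages.items()}
print(installed["bash"])  # ['bash-5.2.26-4.el10']
```

In a playbook the same data arrives as the registered result of `package_facts`, so a role can test `'NetworkManager' in ansible_facts.packages` without shelling out to rpm.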
13355 1727096200.82272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096201.27439: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 13355 1727096201.27557: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": 
[{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": 
"1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", 
"release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", 
"version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", 
"source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": 
"1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": 
[{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": 
"10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13355 1727096201.29476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096201.29481: stdout chunk (state=3): >>><<< 13355 1727096201.29483: stderr chunk (state=3): >>><<< 13355 1727096201.29577: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": 
"20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": 
[{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": 
"1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": 
"1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": 
"libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": 
"1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", 
"version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": 
"libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": 
"sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": 
"102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": 
"vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", 
"version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", 
"release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": 
"perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", 
"version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096201.33007: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096201.33037: _low_level_execute_command(): starting 13355 1727096201.33050: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096200.6547844-15532-55360797873928/ > /dev/null 2>&1 && sleep 0' 13355 1727096201.33659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096201.33680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096201.33698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096201.33715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096201.33733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096201.33746: stderr chunk (state=3): >>>debug2: match not found <<< 
13355 1727096201.33787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096201.33860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096201.33883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096201.33938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096201.34044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096201.36274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096201.36279: stdout chunk (state=3): >>><<< 13355 1727096201.36284: stderr chunk (state=3): >>><<< 13355 1727096201.36316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096201.36320: handler run complete 13355 1727096201.38365: variable 'ansible_facts' from source: unknown 13355 1727096201.39465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096201.43992: variable 'ansible_facts' from source: unknown 13355 1727096201.45227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096201.46868: attempt loop complete, returning result 13355 1727096201.46887: _execute() done 13355 1727096201.46890: dumping result to json 13355 1727096201.47415: done dumping result, returning 13355 1727096201.47575: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0afff68d-5257-c514-593f-00000000091a] 13355 1727096201.47579: sending task result for task 0afff68d-5257-c514-593f-00000000091a 13355 1727096201.51014: done sending task result for task 0afff68d-5257-c514-593f-00000000091a 13355 1727096201.51018: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096201.51149: no more pending results, returning what we have 13355 1727096201.51153: results queue empty 13355 1727096201.51154: checking for any_errors_fatal 13355 1727096201.51162: done checking for any_errors_fatal 13355 
1727096201.51163: checking for max_fail_percentage 13355 1727096201.51165: done checking for max_fail_percentage 13355 1727096201.51166: checking to see if all hosts have failed and the running result is not ok 13355 1727096201.51176: done checking to see if all hosts have failed 13355 1727096201.51177: getting the remaining hosts for this loop 13355 1727096201.51178: done getting the remaining hosts for this loop 13355 1727096201.51182: getting the next task for host managed_node3 13355 1727096201.51189: done getting next task for host managed_node3 13355 1727096201.51192: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096201.51197: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096201.51211: getting variables 13355 1727096201.51213: in VariableManager get_vars() 13355 1727096201.51259: Calling all_inventory to load vars for managed_node3 13355 1727096201.51262: Calling groups_inventory to load vars for managed_node3 13355 1727096201.51265: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096201.51285: Calling all_plugins_play to load vars for managed_node3 13355 1727096201.51288: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096201.51292: Calling groups_plugins_play to load vars for managed_node3 13355 1727096201.52395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096201.53978: done with get_vars() 13355 1727096201.54037: done getting variables 13355 1727096201.54155: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Monday 23 September 2024 08:56:41 -0400 (0:00:00.975) 0:00:50.802 ****** 13355 1727096201.54220: entering _queue_task() for managed_node3/debug 13355 1727096201.54665: worker is 1 (out of 1 available) 13355 1727096201.54686: exiting _queue_task() for managed_node3/debug 13355 1727096201.54703: done queuing things up, now waiting for results queue to drain 13355 1727096201.54705: waiting for pending results... 
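The task banner above ("Print network provider", `tasks/main.yml:7`) corresponds to a `debug` task in the role. As a rough sketch of what such a task typically looks like (the log confirms `network_provider` comes from an earlier `set_fact`; the exact task body is not shown in this trace, so treat this as illustrative):

```yaml
- name: Print network provider
  debug:
    msg: "Using network provider: {{ network_provider }}"
```

The later log line `MSG: Using network provider: nm` is consistent with a message of this shape being rendered with `network_provider: nm`.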
13355 1727096201.55165: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 13355 1727096201.55238: in run() - task 0afff68d-5257-c514-593f-00000000016d 13355 1727096201.55288: variable 'ansible_search_path' from source: unknown 13355 1727096201.55295: variable 'ansible_search_path' from source: unknown 13355 1727096201.55431: calling self._execute() 13355 1727096201.55477: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096201.55592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096201.55621: variable 'omit' from source: magic vars 13355 1727096201.56140: variable 'ansible_distribution_major_version' from source: facts 13355 1727096201.56159: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096201.56176: variable 'omit' from source: magic vars 13355 1727096201.56254: variable 'omit' from source: magic vars 13355 1727096201.56383: variable 'network_provider' from source: set_fact 13355 1727096201.56401: variable 'omit' from source: magic vars 13355 1727096201.56474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096201.56478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096201.56536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096201.56539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096201.56542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096201.56582: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096201.56585: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 
1727096201.56593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096201.56701: Set connection var ansible_shell_executable to /bin/sh 13355 1727096201.56705: Set connection var ansible_shell_type to sh 13355 1727096201.56707: Set connection var ansible_pipelining to False 13355 1727096201.56709: Set connection var ansible_connection to ssh 13355 1727096201.56712: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096201.57076: Set connection var ansible_timeout to 10 13355 1727096201.57079: variable 'ansible_shell_executable' from source: unknown 13355 1727096201.57081: variable 'ansible_connection' from source: unknown 13355 1727096201.57083: variable 'ansible_module_compression' from source: unknown 13355 1727096201.57085: variable 'ansible_shell_type' from source: unknown 13355 1727096201.57087: variable 'ansible_shell_executable' from source: unknown 13355 1727096201.57089: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096201.57090: variable 'ansible_pipelining' from source: unknown 13355 1727096201.57092: variable 'ansible_timeout' from source: unknown 13355 1727096201.57093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096201.57096: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096201.57098: variable 'omit' from source: magic vars 13355 1727096201.57099: starting attempt loop 13355 1727096201.57101: running the handler 13355 1727096201.57103: handler run complete 13355 1727096201.57105: attempt loop complete, returning result 13355 1727096201.57107: _execute() done 13355 1727096201.57108: dumping result to json 13355 1727096201.57110: done dumping result, returning 
13355 1727096201.57111: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [0afff68d-5257-c514-593f-00000000016d] 13355 1727096201.57113: sending task result for task 0afff68d-5257-c514-593f-00000000016d 13355 1727096201.57184: done sending task result for task 0afff68d-5257-c514-593f-00000000016d 13355 1727096201.57188: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 13355 1727096201.57259: no more pending results, returning what we have 13355 1727096201.57262: results queue empty 13355 1727096201.57263: checking for any_errors_fatal 13355 1727096201.57275: done checking for any_errors_fatal 13355 1727096201.57276: checking for max_fail_percentage 13355 1727096201.57278: done checking for max_fail_percentage 13355 1727096201.57279: checking to see if all hosts have failed and the running result is not ok 13355 1727096201.57279: done checking to see if all hosts have failed 13355 1727096201.57280: getting the remaining hosts for this loop 13355 1727096201.57281: done getting the remaining hosts for this loop 13355 1727096201.57285: getting the next task for host managed_node3 13355 1727096201.57293: done getting next task for host managed_node3 13355 1727096201.57296: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13355 1727096201.57301: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
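The next task queued below ("Abort applying the network state configuration if using the `network_state` variable with the initscripts provider", `tasks/main.yml:11`) is evaluated and skipped because its `when` condition is false. Based on the conditionals the log reports (`ansible_distribution_major_version != '6'` evaluates True, `network_state != {}` evaluates False), such a guarded `fail` task is typically written roughly like this — the actual failure message is not visible in the trace, so the `msg` here is a placeholder:

```yaml
- name: Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  fail:
    msg: "placeholder: network_state is not supported with the initscripts provider"
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
```

When every `when` entry must hold and one is false, Ansible reports the task as skipped with the failing condition recorded as `false_condition`, which matches the `skipping: [managed_node3]` result shown below.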
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096201.57314: getting variables 13355 1727096201.57316: in VariableManager get_vars() 13355 1727096201.57574: Calling all_inventory to load vars for managed_node3 13355 1727096201.57577: Calling groups_inventory to load vars for managed_node3 13355 1727096201.57580: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096201.57590: Calling all_plugins_play to load vars for managed_node3 13355 1727096201.57593: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096201.57596: Calling groups_plugins_play to load vars for managed_node3 13355 1727096201.59745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096201.61655: done with get_vars() 13355 1727096201.61687: done getting variables 13355 1727096201.61731: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Monday 23 September 2024 08:56:41 -0400 (0:00:00.075) 0:00:50.878 ****** 13355 1727096201.61764: entering _queue_task() for managed_node3/fail 13355 
1727096201.62048: worker is 1 (out of 1 available) 13355 1727096201.62064: exiting _queue_task() for managed_node3/fail 13355 1727096201.62078: done queuing things up, now waiting for results queue to drain 13355 1727096201.62080: waiting for pending results... 13355 1727096201.62359: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13355 1727096201.62869: in run() - task 0afff68d-5257-c514-593f-00000000016e 13355 1727096201.62874: variable 'ansible_search_path' from source: unknown 13355 1727096201.62877: variable 'ansible_search_path' from source: unknown 13355 1727096201.62975: calling self._execute() 13355 1727096201.63192: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096201.63196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096201.63199: variable 'omit' from source: magic vars 13355 1727096201.64175: variable 'ansible_distribution_major_version' from source: facts 13355 1727096201.64180: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096201.64416: variable 'network_state' from source: role '' defaults 13355 1727096201.64419: Evaluated conditional (network_state != {}): False 13355 1727096201.64423: when evaluation is False, skipping this task 13355 1727096201.64425: _execute() done 13355 1727096201.64428: dumping result to json 13355 1727096201.64430: done dumping result, returning 13355 1727096201.64432: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0afff68d-5257-c514-593f-00000000016e] 13355 1727096201.64434: sending task result for task 0afff68d-5257-c514-593f-00000000016e 13355 1727096201.64774: done sending task result for task 
0afff68d-5257-c514-593f-00000000016e 13355 1727096201.64779: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096201.64819: no more pending results, returning what we have 13355 1727096201.64822: results queue empty 13355 1727096201.64823: checking for any_errors_fatal 13355 1727096201.64830: done checking for any_errors_fatal 13355 1727096201.64831: checking for max_fail_percentage 13355 1727096201.64832: done checking for max_fail_percentage 13355 1727096201.64833: checking to see if all hosts have failed and the running result is not ok 13355 1727096201.64834: done checking to see if all hosts have failed 13355 1727096201.64834: getting the remaining hosts for this loop 13355 1727096201.64836: done getting the remaining hosts for this loop 13355 1727096201.64839: getting the next task for host managed_node3 13355 1727096201.64845: done getting next task for host managed_node3 13355 1727096201.64849: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096201.64853: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 13355 1727096201.64874: getting variables 13355 1727096201.64875: in VariableManager get_vars() 13355 1727096201.64924: Calling all_inventory to load vars for managed_node3 13355 1727096201.64927: Calling groups_inventory to load vars for managed_node3 13355 1727096201.64930: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096201.64938: Calling all_plugins_play to load vars for managed_node3 13355 1727096201.64941: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096201.64944: Calling groups_plugins_play to load vars for managed_node3 13355 1727096201.66464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096201.68858: done with get_vars() 13355 1727096201.68898: done getting variables 13355 1727096201.68953: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Monday 23 September 2024 08:56:41 -0400 (0:00:00.072) 0:00:50.950 ****** 13355 1727096201.68993: entering _queue_task() for managed_node3/fail 13355 1727096201.69458: worker is 1 (out of 1 available) 13355 1727096201.69476: exiting _queue_task() for managed_node3/fail 13355 1727096201.69490: done queuing things up, now waiting for results queue to drain 13355 1727096201.69491: waiting for pending results... 
13355 1727096201.69801: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13355 1727096201.70075: in run() - task 0afff68d-5257-c514-593f-00000000016f 13355 1727096201.70078: variable 'ansible_search_path' from source: unknown 13355 1727096201.70081: variable 'ansible_search_path' from source: unknown 13355 1727096201.70084: calling self._execute() 13355 1727096201.70137: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096201.70154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096201.70176: variable 'omit' from source: magic vars 13355 1727096201.70892: variable 'ansible_distribution_major_version' from source: facts 13355 1727096201.70897: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096201.70910: variable 'network_state' from source: role '' defaults 13355 1727096201.70992: Evaluated conditional (network_state != {}): False 13355 1727096201.71000: when evaluation is False, skipping this task 13355 1727096201.71007: _execute() done 13355 1727096201.71014: dumping result to json 13355 1727096201.71022: done dumping result, returning 13355 1727096201.71037: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0afff68d-5257-c514-593f-00000000016f] 13355 1727096201.71047: sending task result for task 0afff68d-5257-c514-593f-00000000016f 13355 1727096201.71348: done sending task result for task 0afff68d-5257-c514-593f-00000000016f 13355 1727096201.71352: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096201.71406: no more pending results, returning what we have 13355 
1727096201.71410: results queue empty 13355 1727096201.71410: checking for any_errors_fatal 13355 1727096201.71425: done checking for any_errors_fatal 13355 1727096201.71425: checking for max_fail_percentage 13355 1727096201.71428: done checking for max_fail_percentage 13355 1727096201.71429: checking to see if all hosts have failed and the running result is not ok 13355 1727096201.71429: done checking to see if all hosts have failed 13355 1727096201.71430: getting the remaining hosts for this loop 13355 1727096201.71433: done getting the remaining hosts for this loop 13355 1727096201.71548: getting the next task for host managed_node3 13355 1727096201.71559: done getting next task for host managed_node3 13355 1727096201.71565: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096201.71571: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096201.71593: getting variables 13355 1727096201.71594: in VariableManager get_vars() 13355 1727096201.71639: Calling all_inventory to load vars for managed_node3 13355 1727096201.71642: Calling groups_inventory to load vars for managed_node3 13355 1727096201.71644: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096201.71653: Calling all_plugins_play to load vars for managed_node3 13355 1727096201.71655: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096201.71661: Calling groups_plugins_play to load vars for managed_node3 13355 1727096201.73108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096201.74657: done with get_vars() 13355 1727096201.74698: done getting variables 13355 1727096201.74761: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Monday 23 September 2024 08:56:41 -0400 (0:00:00.058) 0:00:51.008 ****** 13355 1727096201.74800: entering _queue_task() for managed_node3/fail 13355 1727096201.75179: worker is 1 (out of 1 available) 13355 1727096201.75192: exiting _queue_task() for managed_node3/fail 13355 1727096201.75206: done queuing things up, now waiting for results queue to drain 13355 1727096201.75207: waiting for pending results... 
13355 1727096201.75590: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13355 1727096201.75663: in run() - task 0afff68d-5257-c514-593f-000000000170 13355 1727096201.75687: variable 'ansible_search_path' from source: unknown 13355 1727096201.75694: variable 'ansible_search_path' from source: unknown 13355 1727096201.75733: calling self._execute() 13355 1727096201.75835: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096201.75846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096201.75858: variable 'omit' from source: magic vars 13355 1727096201.76231: variable 'ansible_distribution_major_version' from source: facts 13355 1727096201.76248: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096201.76427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096201.84148: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096201.84203: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096201.84229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096201.84252: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096201.84276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096201.84328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096201.84348: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096201.84366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096201.84399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096201.84411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096201.84479: variable 'ansible_distribution_major_version' from source: facts 13355 1727096201.84491: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13355 1727096201.84574: variable 'ansible_distribution' from source: facts 13355 1727096201.84578: variable '__network_rh_distros' from source: role '' defaults 13355 1727096201.84585: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13355 1727096201.84744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096201.84762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096201.84780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 
1727096201.84805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096201.84818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096201.84852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096201.84870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096201.84886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096201.84913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096201.84926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096201.84955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096201.84976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13355 1727096201.84991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096201.85015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096201.85026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096201.85213: variable 'network_connections' from source: task vars 13355 1727096201.85221: variable 'controller_profile' from source: play vars 13355 1727096201.85271: variable 'controller_profile' from source: play vars 13355 1727096201.85280: variable 'network_state' from source: role '' defaults 13355 1727096201.85323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096201.85445: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096201.85479: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096201.85498: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096201.85519: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096201.85549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096201.85565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096201.85611: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096201.85628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096201.85647: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13355 1727096201.85650: when evaluation is False, skipping this task 13355 1727096201.85652: _execute() done 13355 1727096201.85655: dumping result to json 13355 1727096201.85660: done dumping result, returning 13355 1727096201.85665: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0afff68d-5257-c514-593f-000000000170] 13355 1727096201.85671: sending task result for task 0afff68d-5257-c514-593f-000000000170 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13355 1727096201.85803: no more pending results, returning what we have 13355 1727096201.85815: results queue empty 13355 1727096201.85816: checking for any_errors_fatal 13355 1727096201.85866: done checking for 
any_errors_fatal 13355 1727096201.85869: checking for max_fail_percentage 13355 1727096201.85871: done checking for max_fail_percentage 13355 1727096201.85872: checking to see if all hosts have failed and the running result is not ok 13355 1727096201.85873: done checking to see if all hosts have failed 13355 1727096201.85873: getting the remaining hosts for this loop 13355 1727096201.85875: done getting the remaining hosts for this loop 13355 1727096201.85879: getting the next task for host managed_node3 13355 1727096201.85886: done getting next task for host managed_node3 13355 1727096201.85890: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13355 1727096201.85894: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096201.85904: done sending task result for task 0afff68d-5257-c514-593f-000000000170 13355 1727096201.85907: WORKER PROCESS EXITING 13355 1727096201.85930: getting variables 13355 1727096201.85931: in VariableManager get_vars() 13355 1727096201.86098: Calling all_inventory to load vars for managed_node3 13355 1727096201.86101: Calling groups_inventory to load vars for managed_node3 13355 1727096201.86103: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096201.86111: Calling all_plugins_play to load vars for managed_node3 13355 1727096201.86114: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096201.86116: Calling groups_plugins_play to load vars for managed_node3 13355 1727096201.93731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096201.95411: done with get_vars() 13355 1727096201.95446: done getting variables 13355 1727096201.95509: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Monday 23 September 2024 08:56:41 -0400 (0:00:00.207) 0:00:51.217 ****** 13355 1727096201.95664: entering _queue_task() for managed_node3/dnf 13355 1727096201.96296: worker is 1 (out of 1 available) 13355 1727096201.96309: exiting _queue_task() for managed_node3/dnf 13355 1727096201.96439: done queuing things up, now waiting for results queue to drain 13355 1727096201.96442: waiting for pending results... 
13355 1727096201.96629: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13355 1727096201.96879: in run() - task 0afff68d-5257-c514-593f-000000000171 13355 1727096201.96883: variable 'ansible_search_path' from source: unknown 13355 1727096201.96885: variable 'ansible_search_path' from source: unknown 13355 1727096201.96888: calling self._execute() 13355 1727096201.97007: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096201.97073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096201.97083: variable 'omit' from source: magic vars 13355 1727096201.97554: variable 'ansible_distribution_major_version' from source: facts 13355 1727096201.97575: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096201.97832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096202.02269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096202.02501: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096202.02771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096202.02776: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096202.02778: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096202.02781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.02784: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.02872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.02982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.03006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.03225: variable 'ansible_distribution' from source: facts 13355 1727096202.03242: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.03269: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13355 1727096202.03610: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096202.03778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.03872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.03875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.03897: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.03920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.03962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.03994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.04026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.04080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.04183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.04200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.04472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 
1727096202.04476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.04478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.04481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.04718: variable 'network_connections' from source: task vars 13355 1727096202.04764: variable 'controller_profile' from source: play vars 13355 1727096202.04883: variable 'controller_profile' from source: play vars 13355 1727096202.04992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096202.05234: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096202.05302: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096202.05372: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096202.05376: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096202.05428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096202.05464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096202.05537: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.05545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096202.05600: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096202.05986: variable 'network_connections' from source: task vars 13355 1727096202.05989: variable 'controller_profile' from source: play vars 13355 1727096202.06040: variable 'controller_profile' from source: play vars 13355 1727096202.06078: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096202.06092: when evaluation is False, skipping this task 13355 1727096202.06099: _execute() done 13355 1727096202.06105: dumping result to json 13355 1727096202.06131: done dumping result, returning 13355 1727096202.06196: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000171] 13355 1727096202.06204: sending task result for task 0afff68d-5257-c514-593f-000000000171 13355 1727096202.06649: done sending task result for task 0afff68d-5257-c514-593f-000000000171 13355 1727096202.06652: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096202.06734: no more pending results, returning what we have 13355 1727096202.06738: results queue empty 13355 1727096202.06739: checking for any_errors_fatal 13355 1727096202.06748: done checking for 
any_errors_fatal 13355 1727096202.06749: checking for max_fail_percentage 13355 1727096202.06751: done checking for max_fail_percentage 13355 1727096202.06752: checking to see if all hosts have failed and the running result is not ok 13355 1727096202.06753: done checking to see if all hosts have failed 13355 1727096202.06754: getting the remaining hosts for this loop 13355 1727096202.06755: done getting the remaining hosts for this loop 13355 1727096202.06759: getting the next task for host managed_node3 13355 1727096202.06769: done getting next task for host managed_node3 13355 1727096202.06774: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13355 1727096202.06778: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096202.06810: getting variables 13355 1727096202.06812: in VariableManager get_vars() 13355 1727096202.07077: Calling all_inventory to load vars for managed_node3 13355 1727096202.07080: Calling groups_inventory to load vars for managed_node3 13355 1727096202.07083: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096202.07092: Calling all_plugins_play to load vars for managed_node3 13355 1727096202.07095: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096202.07098: Calling groups_plugins_play to load vars for managed_node3 13355 1727096202.10525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096202.14939: done with get_vars() 13355 1727096202.14965: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13355 1727096202.15048: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Monday 23 September 2024 08:56:42 -0400 (0:00:00.195) 0:00:51.413 ****** 13355 1727096202.15270: entering _queue_task() for managed_node3/yum 13355 1727096202.16380: worker is 1 (out of 1 available) 13355 1727096202.16395: exiting _queue_task() for managed_node3/yum 13355 1727096202.16409: done queuing things up, now waiting for results queue to drain 13355 1727096202.16411: waiting for pending results... 
13355 1727096202.16853: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13355 1727096202.16996: in run() - task 0afff68d-5257-c514-593f-000000000172 13355 1727096202.17005: variable 'ansible_search_path' from source: unknown 13355 1727096202.17008: variable 'ansible_search_path' from source: unknown 13355 1727096202.17046: calling self._execute() 13355 1727096202.17222: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096202.17229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096202.17237: variable 'omit' from source: magic vars 13355 1727096202.17646: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.17664: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096202.17892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096202.20505: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096202.20560: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096202.20614: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096202.20675: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096202.20929: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096202.21219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.21259: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.21300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.21343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.21359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.21464: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.21690: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13355 1727096202.21725: when evaluation is False, skipping this task 13355 1727096202.21741: _execute() done 13355 1727096202.21745: dumping result to json 13355 1727096202.21747: done dumping result, returning 13355 1727096202.21749: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000172] 13355 1727096202.21752: sending task result for task 0afff68d-5257-c514-593f-000000000172 13355 1727096202.22101: done sending task result for task 0afff68d-5257-c514-593f-000000000172 13355 1727096202.22103: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13355 1727096202.22155: no more pending results, returning 
what we have 13355 1727096202.22160: results queue empty 13355 1727096202.22161: checking for any_errors_fatal 13355 1727096202.22166: done checking for any_errors_fatal 13355 1727096202.22166: checking for max_fail_percentage 13355 1727096202.22170: done checking for max_fail_percentage 13355 1727096202.22171: checking to see if all hosts have failed and the running result is not ok 13355 1727096202.22172: done checking to see if all hosts have failed 13355 1727096202.22172: getting the remaining hosts for this loop 13355 1727096202.22174: done getting the remaining hosts for this loop 13355 1727096202.22177: getting the next task for host managed_node3 13355 1727096202.22184: done getting next task for host managed_node3 13355 1727096202.22188: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096202.22192: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096202.22216: getting variables 13355 1727096202.22218: in VariableManager get_vars() 13355 1727096202.22266: Calling all_inventory to load vars for managed_node3 13355 1727096202.22422: Calling groups_inventory to load vars for managed_node3 13355 1727096202.22426: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096202.22435: Calling all_plugins_play to load vars for managed_node3 13355 1727096202.22438: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096202.22440: Calling groups_plugins_play to load vars for managed_node3 13355 1727096202.23795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096202.26400: done with get_vars() 13355 1727096202.26441: done getting variables 13355 1727096202.26505: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Monday 23 September 2024 08:56:42 -0400 (0:00:00.112) 0:00:51.526 ****** 13355 1727096202.26542: entering _queue_task() for managed_node3/fail 13355 1727096202.26950: worker is 1 (out of 1 available) 13355 1727096202.27171: exiting _queue_task() for managed_node3/fail 13355 1727096202.27186: done queuing things up, now waiting for results queue to drain 13355 1727096202.27187: waiting for pending results... 
13355 1727096202.27325: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13355 1727096202.27473: in run() - task 0afff68d-5257-c514-593f-000000000173 13355 1727096202.27477: variable 'ansible_search_path' from source: unknown 13355 1727096202.27480: variable 'ansible_search_path' from source: unknown 13355 1727096202.27521: calling self._execute() 13355 1727096202.27688: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096202.27691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096202.27694: variable 'omit' from source: magic vars 13355 1727096202.28077: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.28093: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096202.28258: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096202.28507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096202.30903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096202.30989: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096202.31180: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096202.31184: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096202.31187: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096202.31207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096202.31228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.31253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.31297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.31311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.31363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.31388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.31415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.31552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.31555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.31558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.31560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.31571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.31610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.31624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.31822: variable 'network_connections' from source: task vars 13355 1727096202.31854: variable 'controller_profile' from source: play vars 13355 1727096202.31911: variable 'controller_profile' from source: play vars 13355 1727096202.31989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096202.32263: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096202.32671: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096202.32702: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 
1727096202.32728: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096202.32854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096202.32859: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096202.32863: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.32965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096202.32972: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096202.33356: variable 'network_connections' from source: task vars 13355 1727096202.33362: variable 'controller_profile' from source: play vars 13355 1727096202.33476: variable 'controller_profile' from source: play vars 13355 1727096202.33480: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096202.33486: when evaluation is False, skipping this task 13355 1727096202.33490: _execute() done 13355 1727096202.33513: dumping result to json 13355 1727096202.33518: done dumping result, returning 13355 1727096202.33522: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000173] 13355 1727096202.33526: sending task result for task 0afff68d-5257-c514-593f-000000000173 13355 1727096202.33771: 
done sending task result for task 0afff68d-5257-c514-593f-000000000173 13355 1727096202.33777: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096202.33848: no more pending results, returning what we have 13355 1727096202.33854: results queue empty 13355 1727096202.33858: checking for any_errors_fatal 13355 1727096202.33869: done checking for any_errors_fatal 13355 1727096202.33872: checking for max_fail_percentage 13355 1727096202.33876: done checking for max_fail_percentage 13355 1727096202.33877: checking to see if all hosts have failed and the running result is not ok 13355 1727096202.33877: done checking to see if all hosts have failed 13355 1727096202.33878: getting the remaining hosts for this loop 13355 1727096202.33879: done getting the remaining hosts for this loop 13355 1727096202.33883: getting the next task for host managed_node3 13355 1727096202.33894: done getting next task for host managed_node3 13355 1727096202.33899: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13355 1727096202.33903: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096202.33928: getting variables 13355 1727096202.33930: in VariableManager get_vars() 13355 1727096202.34183: Calling all_inventory to load vars for managed_node3 13355 1727096202.34186: Calling groups_inventory to load vars for managed_node3 13355 1727096202.34188: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096202.34198: Calling all_plugins_play to load vars for managed_node3 13355 1727096202.34201: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096202.34204: Calling groups_plugins_play to load vars for managed_node3 13355 1727096202.36827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096202.38664: done with get_vars() 13355 1727096202.38699: done getting variables 13355 1727096202.38747: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Monday 23 September 2024 08:56:42 -0400 (0:00:00.122) 0:00:51.648 ****** 13355 1727096202.38780: entering _queue_task() for managed_node3/package 13355 1727096202.39058: worker is 1 (out of 1 available) 13355 1727096202.39075: exiting _queue_task() for managed_node3/package 13355 1727096202.39086: done queuing things up, now waiting for results queue to drain 13355 1727096202.39087: waiting for pending results... 
13355 1727096202.39271: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 13355 1727096202.39409: in run() - task 0afff68d-5257-c514-593f-000000000174 13355 1727096202.39413: variable 'ansible_search_path' from source: unknown 13355 1727096202.39415: variable 'ansible_search_path' from source: unknown 13355 1727096202.39449: calling self._execute() 13355 1727096202.39541: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096202.39547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096202.39574: variable 'omit' from source: magic vars 13355 1727096202.40012: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.40022: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096202.40379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096202.40901: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096202.40949: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096202.41079: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096202.41151: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096202.41377: variable 'network_packages' from source: role '' defaults 13355 1727096202.42016: variable '__network_provider_setup' from source: role '' defaults 13355 1727096202.42030: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096202.42296: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096202.42326: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096202.42398: variable 
'__network_packages_default_nm' from source: role '' defaults 13355 1727096202.43076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096202.47412: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096202.47473: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096202.47509: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096202.47540: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096202.47569: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096202.47979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.48109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.48155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.48420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.48423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 
1727096202.48426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.48429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.48431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.48434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.48436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.49230: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096202.49989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.49993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.49996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.49998: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.50000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.50012: variable 'ansible_python' from source: facts 13355 1727096202.50041: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096202.50426: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096202.50506: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096202.50740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.50763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.50790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.50828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.50847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.51092: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.51105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.51174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.51187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.51190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.51418: variable 'network_connections' from source: task vars 13355 1727096202.51425: variable 'controller_profile' from source: play vars 13355 1727096202.51527: variable 'controller_profile' from source: play vars 13355 1727096202.51599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096202.51626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096202.51658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.51725: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096202.51734: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096202.52104: variable 'network_connections' from source: task vars 13355 1727096202.52107: variable 'controller_profile' from source: play vars 13355 1727096202.52469: variable 'controller_profile' from source: play vars 13355 1727096202.52475: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096202.52518: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096202.52981: variable 'network_connections' from source: task vars 13355 1727096202.52991: variable 'controller_profile' from source: play vars 13355 1727096202.53138: variable 'controller_profile' from source: play vars 13355 1727096202.53143: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096202.53197: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096202.53554: variable 'network_connections' from source: task vars 13355 1727096202.53558: variable 'controller_profile' from source: play vars 13355 1727096202.53611: variable 'controller_profile' from source: play vars 13355 1727096202.53651: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096202.53699: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096202.53705: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096202.53747: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096202.53891: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096202.54198: variable 'network_connections' from source: task vars 13355 
1727096202.54202: variable 'controller_profile' from source: play vars 13355 1727096202.54247: variable 'controller_profile' from source: play vars 13355 1727096202.54253: variable 'ansible_distribution' from source: facts 13355 1727096202.54256: variable '__network_rh_distros' from source: role '' defaults 13355 1727096202.54265: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.54279: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096202.54387: variable 'ansible_distribution' from source: facts 13355 1727096202.54391: variable '__network_rh_distros' from source: role '' defaults 13355 1727096202.54395: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.54406: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13355 1727096202.54526: variable 'ansible_distribution' from source: facts 13355 1727096202.54530: variable '__network_rh_distros' from source: role '' defaults 13355 1727096202.54538: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.54738: variable 'network_provider' from source: set_fact 13355 1727096202.54742: variable 'ansible_facts' from source: unknown 13355 1727096202.55431: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13355 1727096202.55435: when evaluation is False, skipping this task 13355 1727096202.55438: _execute() done 13355 1727096202.55440: dumping result to json 13355 1727096202.55442: done dumping result, returning 13355 1727096202.55450: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [0afff68d-5257-c514-593f-000000000174] 13355 1727096202.55454: sending task result for task 0afff68d-5257-c514-593f-000000000174 skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", 
"skip_reason": "Conditional result was False" } 13355 1727096202.55607: no more pending results, returning what we have 13355 1727096202.55611: results queue empty 13355 1727096202.55611: checking for any_errors_fatal 13355 1727096202.55621: done checking for any_errors_fatal 13355 1727096202.55622: checking for max_fail_percentage 13355 1727096202.55624: done checking for max_fail_percentage 13355 1727096202.55625: checking to see if all hosts have failed and the running result is not ok 13355 1727096202.55625: done checking to see if all hosts have failed 13355 1727096202.55626: getting the remaining hosts for this loop 13355 1727096202.55627: done getting the remaining hosts for this loop 13355 1727096202.55631: getting the next task for host managed_node3 13355 1727096202.55638: done getting next task for host managed_node3 13355 1727096202.55642: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13355 1727096202.55646: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096202.55672: getting variables 13355 1727096202.55673: in VariableManager get_vars() 13355 1727096202.55722: Calling all_inventory to load vars for managed_node3 13355 1727096202.55725: Calling groups_inventory to load vars for managed_node3 13355 1727096202.55727: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096202.55737: Calling all_plugins_play to load vars for managed_node3 13355 1727096202.55745: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096202.55748: Calling groups_plugins_play to load vars for managed_node3 13355 1727096202.56280: done sending task result for task 0afff68d-5257-c514-593f-000000000174 13355 1727096202.56284: WORKER PROCESS EXITING 13355 1727096202.56773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096202.58406: done with get_vars() 13355 1727096202.58444: done getting variables 13355 1727096202.58516: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Monday 23 September 2024 08:56:42 -0400 (0:00:00.197) 0:00:51.846 ****** 13355 1727096202.58555: entering _queue_task() for managed_node3/package 13355 1727096202.58963: worker is 1 (out of 1 available) 13355 1727096202.59180: exiting _queue_task() for managed_node3/package 13355 1727096202.59193: done queuing things up, now waiting for results queue to drain 13355 1727096202.59194: waiting for pending results... 
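The "Install packages" task above was skipped because its conditional, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False, meaning every required package was already present in the gathered package facts. A minimal Python sketch of that gate, assuming illustrative fact values (Ansible's Jinja2 `subset` test is plain set containment):

```python
# Sketch of the "not network_packages is subset(ansible_facts.packages.keys())"
# conditional: the install task only runs when a required package is missing.
def needs_install(network_packages, installed_packages):
    """Return True when any required package is absent from the package facts."""
    # Ansible's `subset` Jinja2 test behaves like Python's set.issubset().
    return not set(network_packages) <= set(installed_packages)

# Illustrative facts: NetworkManager already installed, so the task is skipped.
installed = {"NetworkManager": [{"version": "1.48"}], "openssh": [{"version": "9.6"}]}
print(needs_install(["NetworkManager"], installed.keys()))  # False -> task skipped
```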
13355 1727096202.59311: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13355 1727096202.59500: in run() - task 0afff68d-5257-c514-593f-000000000175 13355 1727096202.59521: variable 'ansible_search_path' from source: unknown 13355 1727096202.59648: variable 'ansible_search_path' from source: unknown 13355 1727096202.59651: calling self._execute() 13355 1727096202.59701: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096202.59712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096202.59728: variable 'omit' from source: magic vars 13355 1727096202.60159: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.60180: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096202.60318: variable 'network_state' from source: role '' defaults 13355 1727096202.60333: Evaluated conditional (network_state != {}): False 13355 1727096202.60339: when evaluation is False, skipping this task 13355 1727096202.60345: _execute() done 13355 1727096202.60352: dumping result to json 13355 1727096202.60361: done dumping result, returning 13355 1727096202.60377: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0afff68d-5257-c514-593f-000000000175] 13355 1727096202.60388: sending task result for task 0afff68d-5257-c514-593f-000000000175 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096202.60623: no more pending results, returning what we have 13355 1727096202.60628: results queue empty 13355 1727096202.60629: checking for any_errors_fatal 13355 1727096202.60636: done checking for any_errors_fatal 13355 1727096202.60637: checking for max_fail_percentage 13355 
1727096202.60639: done checking for max_fail_percentage 13355 1727096202.60640: checking to see if all hosts have failed and the running result is not ok 13355 1727096202.60641: done checking to see if all hosts have failed 13355 1727096202.60641: getting the remaining hosts for this loop 13355 1727096202.60643: done getting the remaining hosts for this loop 13355 1727096202.60646: getting the next task for host managed_node3 13355 1727096202.60654: done getting next task for host managed_node3 13355 1727096202.60662: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13355 1727096202.60666: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096202.60696: getting variables 13355 1727096202.60698: in VariableManager get_vars() 13355 1727096202.60767: Calling all_inventory to load vars for managed_node3 13355 1727096202.60973: Calling groups_inventory to load vars for managed_node3 13355 1727096202.60976: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096202.60987: Calling all_plugins_play to load vars for managed_node3 13355 1727096202.60990: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096202.60993: Calling groups_plugins_play to load vars for managed_node3 13355 1727096202.61681: done sending task result for task 0afff68d-5257-c514-593f-000000000175 13355 1727096202.61685: WORKER PROCESS EXITING 13355 1727096202.62551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096202.64021: done with get_vars() 13355 1727096202.64052: done getting variables 13355 1727096202.64118: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Monday 23 September 2024 08:56:42 -0400 (0:00:00.055) 0:00:51.902 ****** 13355 1727096202.64155: entering _queue_task() for managed_node3/package 13355 1727096202.64531: worker is 1 (out of 1 available) 13355 1727096202.64544: exiting _queue_task() for managed_node3/package 13355 1727096202.64559: done queuing things up, now waiting for results queue to drain 13355 1727096202.64561: waiting for pending results... 
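Both nmstate-related install tasks ("Install NetworkManager and nmstate…" and "Install python3-libnmstate…") are gated on `network_state != {}`; with the role default of an empty mapping the condition is False and both are skipped, as the log records. A hedged sketch of that gate (function name is illustrative, not from the role):

```python
# Sketch of the "network_state != {}" conditional guarding the nmstate tasks.
# With the role default (an empty dict) the condition is False -> skip.
def uses_network_state(network_state):
    """True only when the caller supplied a non-empty network_state mapping."""
    return network_state != {}

print(uses_network_state({}))                  # False -> task skipped
print(uses_network_state({"interfaces": []}))  # True  -> nmstate would be installed
```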
13355 1727096202.64858: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13355 1727096202.65031: in run() - task 0afff68d-5257-c514-593f-000000000176 13355 1727096202.65052: variable 'ansible_search_path' from source: unknown 13355 1727096202.65064: variable 'ansible_search_path' from source: unknown 13355 1727096202.65111: calling self._execute() 13355 1727096202.65223: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096202.65235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096202.65250: variable 'omit' from source: magic vars 13355 1727096202.65639: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.65659: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096202.65796: variable 'network_state' from source: role '' defaults 13355 1727096202.65812: Evaluated conditional (network_state != {}): False 13355 1727096202.65818: when evaluation is False, skipping this task 13355 1727096202.65824: _execute() done 13355 1727096202.65831: dumping result to json 13355 1727096202.65839: done dumping result, returning 13355 1727096202.65850: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0afff68d-5257-c514-593f-000000000176] 13355 1727096202.65863: sending task result for task 0afff68d-5257-c514-593f-000000000176 13355 1727096202.65989: done sending task result for task 0afff68d-5257-c514-593f-000000000176 13355 1727096202.65996: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096202.66052: no more pending results, returning what we have 13355 1727096202.66059: results queue empty 13355 1727096202.66060: checking for 
any_errors_fatal 13355 1727096202.66070: done checking for any_errors_fatal 13355 1727096202.66071: checking for max_fail_percentage 13355 1727096202.66072: done checking for max_fail_percentage 13355 1727096202.66073: checking to see if all hosts have failed and the running result is not ok 13355 1727096202.66074: done checking to see if all hosts have failed 13355 1727096202.66075: getting the remaining hosts for this loop 13355 1727096202.66076: done getting the remaining hosts for this loop 13355 1727096202.66080: getting the next task for host managed_node3 13355 1727096202.66090: done getting next task for host managed_node3 13355 1727096202.66094: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13355 1727096202.66099: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096202.66126: getting variables 13355 1727096202.66128: in VariableManager get_vars() 13355 1727096202.66293: Calling all_inventory to load vars for managed_node3 13355 1727096202.66296: Calling groups_inventory to load vars for managed_node3 13355 1727096202.66299: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096202.66313: Calling all_plugins_play to load vars for managed_node3 13355 1727096202.66317: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096202.66320: Calling groups_plugins_play to load vars for managed_node3 13355 1727096202.67946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096202.70443: done with get_vars() 13355 1727096202.70493: done getting variables 13355 1727096202.70555: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Monday 23 September 2024 08:56:42 -0400 (0:00:00.064) 0:00:51.966 ****** 13355 1727096202.70598: entering _queue_task() for managed_node3/service 13355 1727096202.71083: worker is 1 (out of 1 available) 13355 1727096202.71095: exiting _queue_task() for managed_node3/service 13355 1727096202.71107: done queuing things up, now waiting for results queue to drain 13355 1727096202.71108: waiting for pending results... 
13355 1727096202.71304: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13355 1727096202.71481: in run() - task 0afff68d-5257-c514-593f-000000000177 13355 1727096202.71548: variable 'ansible_search_path' from source: unknown 13355 1727096202.71552: variable 'ansible_search_path' from source: unknown 13355 1727096202.71555: calling self._execute() 13355 1727096202.71652: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096202.71669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096202.71690: variable 'omit' from source: magic vars 13355 1727096202.72078: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.72109: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096202.72234: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096202.72674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096202.78537: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096202.78625: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096202.78695: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096202.78744: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096202.78796: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096202.78886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13355 1727096202.78922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.78952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.79008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.79028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.79089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.79117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.79182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.79194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.79214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.79260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.79294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.79323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.79400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.79403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.79571: variable 'network_connections' from source: task vars 13355 1727096202.79591: variable 'controller_profile' from source: play vars 13355 1727096202.79674: variable 'controller_profile' from source: play vars 13355 1727096202.79836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096202.79940: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096202.79989: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096202.80060: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 
1727096202.80075: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096202.80121: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096202.80166: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096202.80188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.80215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096202.80274: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096202.80547: variable 'network_connections' from source: task vars 13355 1727096202.80550: variable 'controller_profile' from source: play vars 13355 1727096202.80819: variable 'controller_profile' from source: play vars 13355 1727096202.80823: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13355 1727096202.80825: when evaluation is False, skipping this task 13355 1727096202.80827: _execute() done 13355 1727096202.80829: dumping result to json 13355 1727096202.80831: done dumping result, returning 13355 1727096202.80833: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0afff68d-5257-c514-593f-000000000177] 13355 1727096202.80835: sending task result for task 0afff68d-5257-c514-593f-000000000177 13355 1727096202.81108: done sending task 
result for task 0afff68d-5257-c514-593f-000000000177 13355 1727096202.81119: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13355 1727096202.81194: no more pending results, returning what we have 13355 1727096202.81197: results queue empty 13355 1727096202.81198: checking for any_errors_fatal 13355 1727096202.81206: done checking for any_errors_fatal 13355 1727096202.81206: checking for max_fail_percentage 13355 1727096202.81208: done checking for max_fail_percentage 13355 1727096202.81209: checking to see if all hosts have failed and the running result is not ok 13355 1727096202.81210: done checking to see if all hosts have failed 13355 1727096202.81211: getting the remaining hosts for this loop 13355 1727096202.81212: done getting the remaining hosts for this loop 13355 1727096202.81216: getting the next task for host managed_node3 13355 1727096202.81224: done getting next task for host managed_node3 13355 1727096202.81228: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096202.81232: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096202.81255: getting variables 13355 1727096202.81260: in VariableManager get_vars() 13355 1727096202.81319: Calling all_inventory to load vars for managed_node3 13355 1727096202.81322: Calling groups_inventory to load vars for managed_node3 13355 1727096202.81324: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096202.81335: Calling all_plugins_play to load vars for managed_node3 13355 1727096202.81339: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096202.81342: Calling groups_plugins_play to load vars for managed_node3 13355 1727096202.85203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096202.87913: done with get_vars() 13355 1727096202.87963: done getting variables 13355 1727096202.88031: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Monday 23 September 2024 08:56:42 -0400 (0:00:00.174) 0:00:52.141 ****** 13355 1727096202.88074: entering _queue_task() for managed_node3/service 13355 1727096202.88597: worker is 1 (out of 1 available) 13355 1727096202.88608: exiting _queue_task() for managed_node3/service 13355 1727096202.88620: done queuing things up, now waiting for results queue to drain 13355 1727096202.88621: waiting for pending results... 
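The "Restart NetworkManager due to wireless or team interfaces" task was skipped because neither `__network_wireless_connections_defined` nor `__network_team_connections_defined` was true for the play's connections. Those role defaults are Jinja2 expressions over `network_connections`; the sketch below approximates them with a simple type check (an assumption for illustration, not the role's exact expression):

```python
# Approximate sketch of the restart gate: NetworkManager is restarted only
# when the requested connections include wireless or team interface types.
def needs_nm_restart(network_connections):
    """True when any connection is of type 'wireless' or 'team' (illustrative)."""
    return any(c.get("type") in ("wireless", "team") for c in network_connections)

# The play here defines an ethernet/bond controller profile, so no restart.
print(needs_nm_restart([{"name": "bond0", "type": "bond"}]))  # False -> skip
```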
13355 1727096202.89386: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13355 1727096202.89449: in run() - task 0afff68d-5257-c514-593f-000000000178 13355 1727096202.89473: variable 'ansible_search_path' from source: unknown 13355 1727096202.89875: variable 'ansible_search_path' from source: unknown 13355 1727096202.89879: calling self._execute() 13355 1727096202.89882: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096202.89884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096202.89887: variable 'omit' from source: magic vars 13355 1727096202.90609: variable 'ansible_distribution_major_version' from source: facts 13355 1727096202.90627: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096202.90967: variable 'network_provider' from source: set_fact 13355 1727096202.90996: variable 'network_state' from source: role '' defaults 13355 1727096202.91010: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13355 1727096202.91022: variable 'omit' from source: magic vars 13355 1727096202.91235: variable 'omit' from source: magic vars 13355 1727096202.91274: variable 'network_service_name' from source: role '' defaults 13355 1727096202.91343: variable 'network_service_name' from source: role '' defaults 13355 1727096202.91475: variable '__network_provider_setup' from source: role '' defaults 13355 1727096202.91741: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096202.91744: variable '__network_service_name_default_nm' from source: role '' defaults 13355 1727096202.91777: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096202.91840: variable '__network_packages_default_nm' from source: role '' defaults 13355 1727096202.92526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13355 1727096202.96510: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096202.96598: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096202.96973: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096202.96977: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096202.96979: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096202.96982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.96985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.97320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.97324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.97326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.97401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13355 1727096202.97584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.97702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.97705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.97708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.98544: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13355 1727096202.98951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096202.98987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096202.99023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096202.99072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096202.99101: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096202.99613: variable 'ansible_python' from source: facts 13355 1727096202.99693: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13355 1727096203.00136: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096203.00544: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096203.00990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096203.01060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096203.01209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096203.01280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096203.01283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096203.01413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096203.01554: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096203.01687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096203.01877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096203.01892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096203.02260: variable 'network_connections' from source: task vars 13355 1727096203.02263: variable 'controller_profile' from source: play vars 13355 1727096203.02489: variable 'controller_profile' from source: play vars 13355 1727096203.02688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096203.02945: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096203.03041: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096203.03104: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096203.03151: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096203.03226: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096203.03263: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096203.03310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096203.03348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096203.03407: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096203.03717: variable 'network_connections' from source: task vars 13355 1727096203.03734: variable 'controller_profile' from source: play vars 13355 1727096203.03818: variable 'controller_profile' from source: play vars 13355 1727096203.03868: variable '__network_packages_default_wireless' from source: role '' defaults 13355 1727096203.03953: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096203.04275: variable 'network_connections' from source: task vars 13355 1727096203.04373: variable 'controller_profile' from source: play vars 13355 1727096203.04377: variable 'controller_profile' from source: play vars 13355 1727096203.04380: variable '__network_packages_default_team' from source: role '' defaults 13355 1727096203.04462: variable '__network_team_connections_defined' from source: role '' defaults 13355 1727096203.04777: variable 'network_connections' from source: task vars 13355 1727096203.04788: variable 'controller_profile' from source: play vars 13355 1727096203.04863: variable 'controller_profile' from source: play vars 13355 1727096203.04927: variable '__network_service_name_default_initscripts' from source: role '' defaults 13355 1727096203.05037: variable '__network_service_name_default_initscripts' from 
source: role '' defaults 13355 1727096203.05040: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096203.05074: variable '__network_packages_default_initscripts' from source: role '' defaults 13355 1727096203.05317: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13355 1727096203.06145: variable 'network_connections' from source: task vars 13355 1727096203.06172: variable 'controller_profile' from source: play vars 13355 1727096203.06281: variable 'controller_profile' from source: play vars 13355 1727096203.06298: variable 'ansible_distribution' from source: facts 13355 1727096203.06329: variable '__network_rh_distros' from source: role '' defaults 13355 1727096203.06373: variable 'ansible_distribution_major_version' from source: facts 13355 1727096203.06376: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13355 1727096203.06596: variable 'ansible_distribution' from source: facts 13355 1727096203.06609: variable '__network_rh_distros' from source: role '' defaults 13355 1727096203.06633: variable 'ansible_distribution_major_version' from source: facts 13355 1727096203.06781: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13355 1727096203.06904: variable 'ansible_distribution' from source: facts 13355 1727096203.06925: variable '__network_rh_distros' from source: role '' defaults 13355 1727096203.06934: variable 'ansible_distribution_major_version' from source: facts 13355 1727096203.06995: variable 'network_provider' from source: set_fact 13355 1727096203.07053: variable 'omit' from source: magic vars 13355 1727096203.07203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096203.07209: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096203.07212: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096203.07214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096203.07217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096203.07268: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096203.07281: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096203.07291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096203.07452: Set connection var ansible_shell_executable to /bin/sh 13355 1727096203.07466: Set connection var ansible_shell_type to sh 13355 1727096203.07479: Set connection var ansible_pipelining to False 13355 1727096203.07487: Set connection var ansible_connection to ssh 13355 1727096203.07495: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096203.07503: Set connection var ansible_timeout to 10 13355 1727096203.07543: variable 'ansible_shell_executable' from source: unknown 13355 1727096203.07562: variable 'ansible_connection' from source: unknown 13355 1727096203.07571: variable 'ansible_module_compression' from source: unknown 13355 1727096203.07617: variable 'ansible_shell_type' from source: unknown 13355 1727096203.07620: variable 'ansible_shell_executable' from source: unknown 13355 1727096203.07622: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096203.07626: variable 'ansible_pipelining' from source: unknown 13355 1727096203.07628: variable 'ansible_timeout' from source: unknown 13355 1727096203.07638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096203.07870: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096203.07881: variable 'omit' from source: magic vars 13355 1727096203.07883: starting attempt loop 13355 1727096203.07885: running the handler 13355 1727096203.07930: variable 'ansible_facts' from source: unknown 13355 1727096203.08861: _low_level_execute_command(): starting 13355 1727096203.08875: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096203.09683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.09791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.09819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.09840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.09918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.11556: stdout 
chunk (state=3): >>>/root <<< 13355 1727096203.11642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.11676: stderr chunk (state=3): >>><<< 13355 1727096203.11679: stdout chunk (state=3): >>><<< 13355 1727096203.11702: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096203.11712: _low_level_execute_command(): starting 13355 1727096203.11718: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610 `" && echo ansible-tmp-1727096203.1170225-15630-240682350741610="` echo /root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610 `" ) && sleep 0' 13355 1727096203.12179: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096203.12208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.12211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096203.12213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.12261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.12264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.12312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.14237: stdout chunk (state=3): >>>ansible-tmp-1727096203.1170225-15630-240682350741610=/root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610 <<< 13355 1727096203.14387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.14390: stdout chunk (state=3): >>><<< 13355 1727096203.14392: stderr chunk (state=3): >>><<< 13355 1727096203.14395: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096203.1170225-15630-240682350741610=/root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096203.14416: variable 'ansible_module_compression' from source: unknown 13355 1727096203.14474: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13355 1727096203.14535: variable 'ansible_facts' from source: unknown 13355 1727096203.14774: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/AnsiballZ_systemd.py 13355 1727096203.15022: Sending initial data 13355 1727096203.15025: Sent initial data (156 bytes) 13355 1727096203.15531: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096203.15539: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096203.15550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096203.15564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096203.15634: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.15747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.15773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.15888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.15955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.17522: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 13355 1727096203.17526: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096203.17554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096203.17585: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpe7hr0sc9 /root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/AnsiballZ_systemd.py <<< 13355 1727096203.17592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/AnsiballZ_systemd.py" <<< 13355 1727096203.17616: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpe7hr0sc9" to remote "/root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/AnsiballZ_systemd.py" <<< 13355 1727096203.17622: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/AnsiballZ_systemd.py" <<< 13355 1727096203.19417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.19420: stdout chunk (state=3): >>><<< 13355 1727096203.19423: stderr chunk (state=3): >>><<< 13355 1727096203.19496: done transferring module to remote 13355 1727096203.19499: _low_level_execute_command(): starting 13355 1727096203.19502: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/ /root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/AnsiballZ_systemd.py && sleep 0' 13355 1727096203.20172: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096203.20177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096203.20180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.20183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096203.20185: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096203.20188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096203.20190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.20234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.20248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.20251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.20298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.22102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.22129: stderr chunk (state=3): >>><<< 13355 1727096203.22133: stdout chunk (state=3): >>><<< 13355 1727096203.22146: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096203.22149: _low_level_execute_command(): starting 13355 1727096203.22154: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/AnsiballZ_systemd.py && sleep 0' 13355 1727096203.22997: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.23010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.23075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.23146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.52904: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 
2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10592256", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306950656", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1336343000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", 
"StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 13355 1727096203.52946: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13355 1727096203.55093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096203.55098: stdout chunk (state=3): >>><<< 13355 1727096203.55177: stderr chunk (state=3): >>><<< 13355 1727096203.55183: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainStartTimestampMonotonic": "22098154", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ExecMainHandoffTimestampMonotonic": "22114950", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10592256", "MemoryPeak": "14127104", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3306950656", "EffectiveMemoryMax": "3702882304", "EffectiveMemoryHigh": "3702882304", "CPUUsageNSec": "1336343000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service NetworkManager-wait-online.service network.target multi-user.target", "After": "systemd-journald.socket basic.target system.slice sysinit.target dbus.socket dbus-broker.service cloud-init-local.service network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Mon 2024-09-23 08:54:46 EDT", "StateChangeTimestampMonotonic": "229668491", "InactiveExitTimestamp": "Mon 2024-09-23 08:51:17 EDT", "InactiveExitTimestampMonotonic": "22098658", "ActiveEnterTimestamp": "Mon 2024-09-23 08:51:18 EDT", "ActiveEnterTimestampMonotonic": "22578647", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Mon 2024-09-23 08:51:17 EDT", "ConditionTimestampMonotonic": "22097143", "AssertTimestamp": "Mon 2024-09-23 08:51:17 EDT", "AssertTimestampMonotonic": "22097145", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "795485e52798420593bcdae791c1fbbf", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096203.55329: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096203.55334: _low_level_execute_command(): starting 13355 1727096203.55345: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096203.1170225-15630-240682350741610/ > /dev/null 2>&1 && sleep 0' 13355 1727096203.55951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096203.55986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.56019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.56109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.57966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.57996: stderr chunk (state=3): >>><<< 13355 1727096203.57999: stdout chunk (state=3): >>><<< 13355 1727096203.58013: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096203.58020: handler run complete 13355 1727096203.58063: attempt loop complete, returning result 
13355 1727096203.58066: _execute() done 13355 1727096203.58074: dumping result to json 13355 1727096203.58088: done dumping result, returning 13355 1727096203.58098: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0afff68d-5257-c514-593f-000000000178] 13355 1727096203.58102: sending task result for task 0afff68d-5257-c514-593f-000000000178 13355 1727096203.58343: done sending task result for task 0afff68d-5257-c514-593f-000000000178 13355 1727096203.58346: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096203.58406: no more pending results, returning what we have 13355 1727096203.58409: results queue empty 13355 1727096203.58410: checking for any_errors_fatal 13355 1727096203.58415: done checking for any_errors_fatal 13355 1727096203.58415: checking for max_fail_percentage 13355 1727096203.58417: done checking for max_fail_percentage 13355 1727096203.58417: checking to see if all hosts have failed and the running result is not ok 13355 1727096203.58418: done checking to see if all hosts have failed 13355 1727096203.58419: getting the remaining hosts for this loop 13355 1727096203.58420: done getting the remaining hosts for this loop 13355 1727096203.58423: getting the next task for host managed_node3 13355 1727096203.58430: done getting next task for host managed_node3 13355 1727096203.58433: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096203.58436: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096203.58448: getting variables 13355 1727096203.58449: in VariableManager get_vars() 13355 1727096203.58493: Calling all_inventory to load vars for managed_node3 13355 1727096203.58495: Calling groups_inventory to load vars for managed_node3 13355 1727096203.58497: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096203.58506: Calling all_plugins_play to load vars for managed_node3 13355 1727096203.58509: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096203.58511: Calling groups_plugins_play to load vars for managed_node3 13355 1727096203.59629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096203.60773: done with get_vars() 13355 1727096203.60802: done getting variables 13355 1727096203.60846: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Monday 23 September 2024 08:56:43 -0400 
(0:00:00.728) 0:00:52.869 ****** 13355 1727096203.60876: entering _queue_task() for managed_node3/service 13355 1727096203.61154: worker is 1 (out of 1 available) 13355 1727096203.61173: exiting _queue_task() for managed_node3/service 13355 1727096203.61186: done queuing things up, now waiting for results queue to drain 13355 1727096203.61187: waiting for pending results... 13355 1727096203.61363: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13355 1727096203.61471: in run() - task 0afff68d-5257-c514-593f-000000000179 13355 1727096203.61484: variable 'ansible_search_path' from source: unknown 13355 1727096203.61487: variable 'ansible_search_path' from source: unknown 13355 1727096203.61517: calling self._execute() 13355 1727096203.61594: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096203.61598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096203.61606: variable 'omit' from source: magic vars 13355 1727096203.61900: variable 'ansible_distribution_major_version' from source: facts 13355 1727096203.61912: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096203.61996: variable 'network_provider' from source: set_fact 13355 1727096203.62000: Evaluated conditional (network_provider == "nm"): True 13355 1727096203.62076: variable '__network_wpa_supplicant_required' from source: role '' defaults 13355 1727096203.62133: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13355 1727096203.62252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096203.63710: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096203.63760: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 
1727096203.63787: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096203.63815: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096203.63837: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096203.63908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096203.63931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096203.63951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096203.63980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096203.63992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096203.64027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096203.64051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096203.64064: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096203.64090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096203.64101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096203.64128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096203.64145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096203.64166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096203.64194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096203.64204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096203.64310: variable 'network_connections' from source: task vars 13355 1727096203.64320: variable 'controller_profile' from source: play vars 13355 1727096203.64380: variable 'controller_profile' from source: 
play vars 13355 1727096203.64426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13355 1727096203.64541: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13355 1727096203.64571: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13355 1727096203.64596: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13355 1727096203.64620: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13355 1727096203.64654: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13355 1727096203.64671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13355 1727096203.64689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096203.64712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13355 1727096203.64749: variable '__network_wireless_connections_defined' from source: role '' defaults 13355 1727096203.64921: variable 'network_connections' from source: task vars 13355 1727096203.64926: variable 'controller_profile' from source: play vars 13355 1727096203.64971: variable 'controller_profile' from source: play vars 13355 1727096203.64995: Evaluated conditional (__network_wpa_supplicant_required): False 13355 1727096203.64998: when evaluation is False, skipping this task 
13355 1727096203.65001: _execute() done 13355 1727096203.65003: dumping result to json 13355 1727096203.65005: done dumping result, returning 13355 1727096203.65015: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0afff68d-5257-c514-593f-000000000179] 13355 1727096203.65028: sending task result for task 0afff68d-5257-c514-593f-000000000179 13355 1727096203.65111: done sending task result for task 0afff68d-5257-c514-593f-000000000179 13355 1727096203.65113: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13355 1727096203.65191: no more pending results, returning what we have 13355 1727096203.65195: results queue empty 13355 1727096203.65195: checking for any_errors_fatal 13355 1727096203.65224: done checking for any_errors_fatal 13355 1727096203.65225: checking for max_fail_percentage 13355 1727096203.65229: done checking for max_fail_percentage 13355 1727096203.65229: checking to see if all hosts have failed and the running result is not ok 13355 1727096203.65230: done checking to see if all hosts have failed 13355 1727096203.65230: getting the remaining hosts for this loop 13355 1727096203.65232: done getting the remaining hosts for this loop 13355 1727096203.65235: getting the next task for host managed_node3 13355 1727096203.65242: done getting next task for host managed_node3 13355 1727096203.65248: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096203.65252: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096203.65275: getting variables 13355 1727096203.65277: in VariableManager get_vars() 13355 1727096203.65321: Calling all_inventory to load vars for managed_node3 13355 1727096203.65324: Calling groups_inventory to load vars for managed_node3 13355 1727096203.65326: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096203.65335: Calling all_plugins_play to load vars for managed_node3 13355 1727096203.65338: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096203.65340: Calling groups_plugins_play to load vars for managed_node3 13355 1727096203.66152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096203.67035: done with get_vars() 13355 1727096203.67059: done getting variables 13355 1727096203.67106: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Monday 23 September 2024 08:56:43 -0400 
(0:00:00.062) 0:00:52.931 ****** 13355 1727096203.67133: entering _queue_task() for managed_node3/service 13355 1727096203.67415: worker is 1 (out of 1 available) 13355 1727096203.67429: exiting _queue_task() for managed_node3/service 13355 1727096203.67442: done queuing things up, now waiting for results queue to drain 13355 1727096203.67444: waiting for pending results... 13355 1727096203.67629: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 13355 1727096203.67733: in run() - task 0afff68d-5257-c514-593f-00000000017a 13355 1727096203.67744: variable 'ansible_search_path' from source: unknown 13355 1727096203.67747: variable 'ansible_search_path' from source: unknown 13355 1727096203.67787: calling self._execute() 13355 1727096203.67864: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096203.67871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096203.67878: variable 'omit' from source: magic vars 13355 1727096203.68180: variable 'ansible_distribution_major_version' from source: facts 13355 1727096203.68189: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096203.68276: variable 'network_provider' from source: set_fact 13355 1727096203.68280: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096203.68283: when evaluation is False, skipping this task 13355 1727096203.68285: _execute() done 13355 1727096203.68289: dumping result to json 13355 1727096203.68292: done dumping result, returning 13355 1727096203.68300: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [0afff68d-5257-c514-593f-00000000017a] 13355 1727096203.68304: sending task result for task 0afff68d-5257-c514-593f-00000000017a 13355 1727096203.68395: done sending task result for task 0afff68d-5257-c514-593f-00000000017a 13355 1727096203.68397: WORKER PROCESS 
EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13355 1727096203.68472: no more pending results, returning what we have 13355 1727096203.68475: results queue empty 13355 1727096203.68476: checking for any_errors_fatal 13355 1727096203.68484: done checking for any_errors_fatal 13355 1727096203.68485: checking for max_fail_percentage 13355 1727096203.68487: done checking for max_fail_percentage 13355 1727096203.68488: checking to see if all hosts have failed and the running result is not ok 13355 1727096203.68488: done checking to see if all hosts have failed 13355 1727096203.68489: getting the remaining hosts for this loop 13355 1727096203.68490: done getting the remaining hosts for this loop 13355 1727096203.68494: getting the next task for host managed_node3 13355 1727096203.68502: done getting next task for host managed_node3 13355 1727096203.68507: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13355 1727096203.68511: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096203.68534: getting variables 13355 1727096203.68536: in VariableManager get_vars() 13355 1727096203.68584: Calling all_inventory to load vars for managed_node3 13355 1727096203.68586: Calling groups_inventory to load vars for managed_node3 13355 1727096203.68588: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096203.68597: Calling all_plugins_play to load vars for managed_node3 13355 1727096203.68600: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096203.68602: Calling groups_plugins_play to load vars for managed_node3 13355 1727096203.69541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096203.70424: done with get_vars() 13355 1727096203.70450: done getting variables 13355 1727096203.70500: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Monday 23 September 2024 08:56:43 -0400 (0:00:00.033) 0:00:52.965 ****** 13355 1727096203.70529: entering _queue_task() for managed_node3/copy 13355 1727096203.70816: worker is 1 (out of 1 available) 13355 1727096203.70829: exiting _queue_task() for managed_node3/copy 13355 1727096203.70842: done queuing things up, now waiting for results queue to drain 13355 1727096203.70844: waiting for pending results... 
13355 1727096203.71029: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13355 1727096203.71124: in run() - task 0afff68d-5257-c514-593f-00000000017b 13355 1727096203.71135: variable 'ansible_search_path' from source: unknown 13355 1727096203.71139: variable 'ansible_search_path' from source: unknown 13355 1727096203.71171: calling self._execute() 13355 1727096203.71258: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096203.71263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096203.71271: variable 'omit' from source: magic vars 13355 1727096203.71563: variable 'ansible_distribution_major_version' from source: facts 13355 1727096203.71574: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096203.71660: variable 'network_provider' from source: set_fact 13355 1727096203.71664: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096203.71666: when evaluation is False, skipping this task 13355 1727096203.71672: _execute() done 13355 1727096203.71674: dumping result to json 13355 1727096203.71676: done dumping result, returning 13355 1727096203.71684: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0afff68d-5257-c514-593f-00000000017b] 13355 1727096203.71689: sending task result for task 0afff68d-5257-c514-593f-00000000017b 13355 1727096203.71786: done sending task result for task 0afff68d-5257-c514-593f-00000000017b 13355 1727096203.71788: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096203.71835: no more pending results, returning what we have 13355 1727096203.71840: results queue empty 13355 1727096203.71840: checking for 
any_errors_fatal 13355 1727096203.71847: done checking for any_errors_fatal 13355 1727096203.71847: checking for max_fail_percentage 13355 1727096203.71849: done checking for max_fail_percentage 13355 1727096203.71850: checking to see if all hosts have failed and the running result is not ok 13355 1727096203.71851: done checking to see if all hosts have failed 13355 1727096203.71851: getting the remaining hosts for this loop 13355 1727096203.71852: done getting the remaining hosts for this loop 13355 1727096203.71858: getting the next task for host managed_node3 13355 1727096203.71865: done getting next task for host managed_node3 13355 1727096203.71871: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13355 1727096203.71875: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096203.71900: getting variables 13355 1727096203.71902: in VariableManager get_vars() 13355 1727096203.71949: Calling all_inventory to load vars for managed_node3 13355 1727096203.71951: Calling groups_inventory to load vars for managed_node3 13355 1727096203.71953: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096203.71966: Calling all_plugins_play to load vars for managed_node3 13355 1727096203.71979: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096203.71982: Calling groups_plugins_play to load vars for managed_node3 13355 1727096203.72802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096203.73807: done with get_vars() 13355 1727096203.73829: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Monday 23 September 2024 08:56:43 -0400 (0:00:00.033) 0:00:52.999 ****** 13355 1727096203.73897: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13355 1727096203.74179: worker is 1 (out of 1 available) 13355 1727096203.74194: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 13355 1727096203.74206: done queuing things up, now waiting for results queue to drain 13355 1727096203.74208: waiting for pending results... 
13355 1727096203.74392: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13355 1727096203.74495: in run() - task 0afff68d-5257-c514-593f-00000000017c 13355 1727096203.74508: variable 'ansible_search_path' from source: unknown 13355 1727096203.74511: variable 'ansible_search_path' from source: unknown 13355 1727096203.74546: calling self._execute() 13355 1727096203.74624: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096203.74628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096203.74637: variable 'omit' from source: magic vars 13355 1727096203.74929: variable 'ansible_distribution_major_version' from source: facts 13355 1727096203.74939: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096203.74944: variable 'omit' from source: magic vars 13355 1727096203.74999: variable 'omit' from source: magic vars 13355 1727096203.75122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13355 1727096203.76634: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13355 1727096203.76683: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13355 1727096203.76710: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13355 1727096203.76739: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13355 1727096203.76763: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13355 1727096203.76831: variable 'network_provider' from source: set_fact 13355 1727096203.76928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13355 1727096203.76960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13355 1727096203.76982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13355 1727096203.77007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13355 1727096203.77019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13355 1727096203.77079: variable 'omit' from source: magic vars 13355 1727096203.77156: variable 'omit' from source: magic vars 13355 1727096203.77228: variable 'network_connections' from source: task vars 13355 1727096203.77237: variable 'controller_profile' from source: play vars 13355 1727096203.77286: variable 'controller_profile' from source: play vars 13355 1727096203.77390: variable 'omit' from source: magic vars 13355 1727096203.77398: variable '__lsr_ansible_managed' from source: task vars 13355 1727096203.77440: variable '__lsr_ansible_managed' from source: task vars 13355 1727096203.77566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13355 1727096203.77721: Loaded config def from plugin (lookup/template) 13355 1727096203.77724: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13355 1727096203.77746: File lookup term: get_ansible_managed.j2 13355 1727096203.77749: 
variable 'ansible_search_path' from source: unknown 13355 1727096203.77753: evaluation_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13355 1727096203.77766: search_path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13355 1727096203.77783: variable 'ansible_search_path' from source: unknown 13355 1727096203.81314: variable 'ansible_managed' from source: unknown 13355 1727096203.81403: variable 'omit' from source: magic vars 13355 1727096203.81424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096203.81445: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096203.81458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096203.81476: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096203.81487: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096203.81509: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096203.81512: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096203.81514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096203.81577: Set connection var ansible_shell_executable to /bin/sh 13355 1727096203.81581: Set connection var ansible_shell_type to sh 13355 1727096203.81588: Set connection var ansible_pipelining to False 13355 1727096203.81590: Set connection var ansible_connection to ssh 13355 1727096203.81601: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096203.81603: Set connection var ansible_timeout to 10 13355 1727096203.81621: variable 'ansible_shell_executable' from source: unknown 13355 1727096203.81624: variable 'ansible_connection' from source: unknown 13355 1727096203.81626: variable 'ansible_module_compression' from source: unknown 13355 1727096203.81629: variable 'ansible_shell_type' from source: unknown 13355 1727096203.81631: variable 'ansible_shell_executable' from source: unknown 13355 1727096203.81634: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096203.81636: variable 'ansible_pipelining' from source: unknown 13355 1727096203.81640: variable 'ansible_timeout' from source: unknown 13355 1727096203.81644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096203.81743: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096203.81755: variable 'omit' from source: magic vars 13355 1727096203.81758: starting attempt loop 13355 1727096203.81763: running the handler 13355 1727096203.81777: _low_level_execute_command(): starting 13355 1727096203.81783: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096203.82272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096203.82301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.82304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096203.82307: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.82348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.82383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.82409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.84097: stdout 
chunk (state=3): >>>/root <<< 13355 1727096203.84188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.84223: stderr chunk (state=3): >>><<< 13355 1727096203.84226: stdout chunk (state=3): >>><<< 13355 1727096203.84248: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096203.84262: _low_level_execute_command(): starting 13355 1727096203.84269: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911 `" && echo ansible-tmp-1727096203.8424845-15676-103934854564911="` echo /root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911 `" ) && sleep 0' 13355 1727096203.84721: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096203.84725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.84746: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.84798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.84802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.84804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.84843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.86811: stdout chunk (state=3): >>>ansible-tmp-1727096203.8424845-15676-103934854564911=/root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911 <<< 13355 1727096203.86916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.86939: stderr chunk (state=3): >>><<< 13355 1727096203.86943: stdout chunk (state=3): >>><<< 13355 1727096203.86961: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096203.8424845-15676-103934854564911=/root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096203.87001: variable 'ansible_module_compression' from source: unknown 13355 1727096203.87044: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13355 1727096203.87086: variable 'ansible_facts' from source: unknown 13355 1727096203.87181: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/AnsiballZ_network_connections.py 13355 1727096203.87289: Sending initial data 13355 1727096203.87292: Sent initial data (168 bytes) 13355 1727096203.87726: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096203.87732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096203.87761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.87778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096203.87781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096203.87783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.87824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.87827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.87829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.87873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.89465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096203.89495: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096203.89526: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmptj93_fur /root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/AnsiballZ_network_connections.py <<< 13355 1727096203.89533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/AnsiballZ_network_connections.py" <<< 13355 1727096203.89561: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmptj93_fur" to remote "/root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/AnsiballZ_network_connections.py" <<< 13355 1727096203.89563: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/AnsiballZ_network_connections.py" <<< 13355 1727096203.90236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.90283: stderr chunk (state=3): >>><<< 13355 1727096203.90287: stdout chunk (state=3): >>><<< 13355 1727096203.90304: done transferring module to remote 13355 1727096203.90313: _low_level_execute_command(): starting 13355 1727096203.90318: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/ 
/root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/AnsiballZ_network_connections.py && sleep 0' 13355 1727096203.90782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096203.90786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096203.90788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.90790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096203.90792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096203.90796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.90846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.90850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.90854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.90887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096203.92683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096203.92708: stderr chunk (state=3): >>><<< 13355 1727096203.92712: stdout chunk (state=3): 
>>><<< 13355 1727096203.92728: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096203.92731: _low_level_execute_command(): starting 13355 1727096203.92736: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/AnsiballZ_network_connections.py && sleep 0' 13355 1727096203.93203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096203.93207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.93210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096203.93212: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096203.93260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096203.93264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096203.93271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096203.93309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096204.32081: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7kz22rth/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7kz22rth/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a55cc535-e38f-4547-bb0f-3479e284a0c7: error=unknown <<< 13355 1727096204.32219: stdout chunk 
(state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13355 1727096204.34278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096204.34306: stderr chunk (state=3): >>><<< 13355 1727096204.34310: stdout chunk (state=3): >>><<< 13355 1727096204.34331: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7kz22rth/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_7kz22rth/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a55cc535-e38f-4547-bb0f-3479e284a0c7: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "down", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096204.34362: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'down', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096204.34369: _low_level_execute_command(): starting 13355 1727096204.34375: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096203.8424845-15676-103934854564911/ > /dev/null 2>&1 && sleep 0' 13355 1727096204.34843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096204.34846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.34848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096204.34851: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096204.34853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.34901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096204.34913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096204.34958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096204.36870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096204.36897: stderr chunk (state=3): >>><<< 13355 1727096204.36900: stdout chunk (state=3): >>><<< 13355 1727096204.36921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096204.36929: handler run complete 13355 1727096204.36948: attempt loop complete, returning result 13355 1727096204.36950: _execute() done 13355 1727096204.36953: dumping result to json 13355 1727096204.36958: done dumping result, returning 13355 1727096204.36971: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0afff68d-5257-c514-593f-00000000017c] 13355 1727096204.36976: sending task result for task 0afff68d-5257-c514-593f-00000000017c 13355 1727096204.37078: done sending task result for task 0afff68d-5257-c514-593f-00000000017c 13355 1727096204.37081: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13355 1727096204.37181: no more pending results, returning what we have 13355 1727096204.37185: results queue empty 13355 1727096204.37186: checking for any_errors_fatal 13355 1727096204.37201: done checking for any_errors_fatal 13355 1727096204.37202: checking for max_fail_percentage 13355 1727096204.37204: done checking for max_fail_percentage 13355 1727096204.37205: checking to see if all hosts have failed and the running result is not ok 13355 1727096204.37205: done checking to see if all hosts have failed 13355 1727096204.37206: getting the remaining hosts for this loop 13355 1727096204.37207: done getting the remaining hosts for this loop 13355 1727096204.37210: getting the next task for host managed_node3 13355 
1727096204.37217: done getting next task for host managed_node3 13355 1727096204.37221: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096204.37224: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096204.37236: getting variables 13355 1727096204.37237: in VariableManager get_vars() 13355 1727096204.37287: Calling all_inventory to load vars for managed_node3 13355 1727096204.37290: Calling groups_inventory to load vars for managed_node3 13355 1727096204.37292: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096204.37306: Calling all_plugins_play to load vars for managed_node3 13355 1727096204.37309: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096204.37311: Calling groups_plugins_play to load vars for managed_node3 13355 1727096204.38149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096204.39037: done with get_vars() 13355 1727096204.39062: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Monday 23 September 2024 08:56:44 -0400 (0:00:00.652) 0:00:53.651 ****** 13355 1727096204.39130: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13355 1727096204.39411: worker is 1 (out of 1 available) 13355 1727096204.39423: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 13355 1727096204.39437: done queuing things up, now waiting for results queue to drain 13355 1727096204.39438: waiting for pending results... 
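The "Configure networking state" task queued here is guarded on `network_state`, which in this run comes from the role defaults as `{}`, so the conditional fails and the task is skipped. A hedged sketch of a task guarded this way (the module arguments are an assumption, not the role's actual tasks file):

```yaml
# Sketch: only apply nmstate-style configuration when the caller
# actually supplied a non-empty network_state dictionary.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"  # assumed parameter shape
  when: network_state != {}
```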
13355 1727096204.39615: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 13355 1727096204.39713: in run() - task 0afff68d-5257-c514-593f-00000000017d 13355 1727096204.39725: variable 'ansible_search_path' from source: unknown 13355 1727096204.39728: variable 'ansible_search_path' from source: unknown 13355 1727096204.39761: calling self._execute() 13355 1727096204.39841: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.39845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.39853: variable 'omit' from source: magic vars 13355 1727096204.40172: variable 'ansible_distribution_major_version' from source: facts 13355 1727096204.40182: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096204.40273: variable 'network_state' from source: role '' defaults 13355 1727096204.40282: Evaluated conditional (network_state != {}): False 13355 1727096204.40285: when evaluation is False, skipping this task 13355 1727096204.40288: _execute() done 13355 1727096204.40291: dumping result to json 13355 1727096204.40293: done dumping result, returning 13355 1727096204.40300: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [0afff68d-5257-c514-593f-00000000017d] 13355 1727096204.40305: sending task result for task 0afff68d-5257-c514-593f-00000000017d 13355 1727096204.40400: done sending task result for task 0afff68d-5257-c514-593f-00000000017d 13355 1727096204.40403: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13355 1727096204.40473: no more pending results, returning what we have 13355 1727096204.40478: results queue empty 13355 1727096204.40478: checking for any_errors_fatal 13355 1727096204.40488: done checking for any_errors_fatal 
13355 1727096204.40489: checking for max_fail_percentage 13355 1727096204.40491: done checking for max_fail_percentage 13355 1727096204.40491: checking to see if all hosts have failed and the running result is not ok 13355 1727096204.40492: done checking to see if all hosts have failed 13355 1727096204.40493: getting the remaining hosts for this loop 13355 1727096204.40494: done getting the remaining hosts for this loop 13355 1727096204.40497: getting the next task for host managed_node3 13355 1727096204.40505: done getting next task for host managed_node3 13355 1727096204.40510: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096204.40514: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096204.40537: getting variables 13355 1727096204.40538: in VariableManager get_vars() 13355 1727096204.40588: Calling all_inventory to load vars for managed_node3 13355 1727096204.40591: Calling groups_inventory to load vars for managed_node3 13355 1727096204.40593: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096204.40603: Calling all_plugins_play to load vars for managed_node3 13355 1727096204.40605: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096204.40608: Calling groups_plugins_play to load vars for managed_node3 13355 1727096204.41563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096204.43005: done with get_vars() 13355 1727096204.43033: done getting variables 13355 1727096204.43088: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Monday 23 September 2024 08:56:44 -0400 (0:00:00.039) 0:00:53.691 ****** 13355 1727096204.43116: entering _queue_task() for managed_node3/debug 13355 1727096204.43402: worker is 1 (out of 1 available) 13355 1727096204.43416: exiting _queue_task() for managed_node3/debug 13355 1727096204.43428: done queuing things up, now waiting for results queue to drain 13355 1727096204.43429: waiting for pending results... 
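The "Show stderr messages" task queued here is the role's reporting step: a `debug` action over the `__network_connections_result` fact captured from the earlier module run. A sketch of what such a task looks like, reconstructed from the log (the exact wording in the role's tasks/main.yml is an assumption):

```yaml
# Sketch: surface any stderr captured from the
# network_connections module invocation.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```

In this run the captured stderr is just an empty line, so the task prints `"stderr_lines": [""]` and the traceback seen on stdout earlier is not repeated here.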
13355 1727096204.43617: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13355 1727096204.43717: in run() - task 0afff68d-5257-c514-593f-00000000017e 13355 1727096204.43730: variable 'ansible_search_path' from source: unknown 13355 1727096204.43734: variable 'ansible_search_path' from source: unknown 13355 1727096204.43776: calling self._execute() 13355 1727096204.43845: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.43849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.43860: variable 'omit' from source: magic vars 13355 1727096204.44145: variable 'ansible_distribution_major_version' from source: facts 13355 1727096204.44155: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096204.44161: variable 'omit' from source: magic vars 13355 1727096204.44212: variable 'omit' from source: magic vars 13355 1727096204.44240: variable 'omit' from source: magic vars 13355 1727096204.44275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096204.44305: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096204.44322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096204.44335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096204.44345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096204.44371: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096204.44375: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.44378: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 13355 1727096204.44449: Set connection var ansible_shell_executable to /bin/sh 13355 1727096204.44453: Set connection var ansible_shell_type to sh 13355 1727096204.44461: Set connection var ansible_pipelining to False 13355 1727096204.44464: Set connection var ansible_connection to ssh 13355 1727096204.44469: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096204.44474: Set connection var ansible_timeout to 10 13355 1727096204.44494: variable 'ansible_shell_executable' from source: unknown 13355 1727096204.44497: variable 'ansible_connection' from source: unknown 13355 1727096204.44500: variable 'ansible_module_compression' from source: unknown 13355 1727096204.44502: variable 'ansible_shell_type' from source: unknown 13355 1727096204.44504: variable 'ansible_shell_executable' from source: unknown 13355 1727096204.44506: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.44508: variable 'ansible_pipelining' from source: unknown 13355 1727096204.44511: variable 'ansible_timeout' from source: unknown 13355 1727096204.44515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.44675: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096204.44679: variable 'omit' from source: magic vars 13355 1727096204.44682: starting attempt loop 13355 1727096204.44684: running the handler 13355 1727096204.45177: variable '__network_connections_result' from source: set_fact 13355 1727096204.45180: handler run complete 13355 1727096204.45182: attempt loop complete, returning result 13355 1727096204.45184: _execute() done 13355 1727096204.45186: dumping result to json 13355 1727096204.45188: 
done dumping result, returning 13355 1727096204.45190: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0afff68d-5257-c514-593f-00000000017e] 13355 1727096204.45192: sending task result for task 0afff68d-5257-c514-593f-00000000017e 13355 1727096204.45259: done sending task result for task 0afff68d-5257-c514-593f-00000000017e 13355 1727096204.45262: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 13355 1727096204.45335: no more pending results, returning what we have 13355 1727096204.45340: results queue empty 13355 1727096204.45341: checking for any_errors_fatal 13355 1727096204.45348: done checking for any_errors_fatal 13355 1727096204.45349: checking for max_fail_percentage 13355 1727096204.45350: done checking for max_fail_percentage 13355 1727096204.45351: checking to see if all hosts have failed and the running result is not ok 13355 1727096204.45352: done checking to see if all hosts have failed 13355 1727096204.45352: getting the remaining hosts for this loop 13355 1727096204.45353: done getting the remaining hosts for this loop 13355 1727096204.45360: getting the next task for host managed_node3 13355 1727096204.45370: done getting next task for host managed_node3 13355 1727096204.45374: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096204.45378: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096204.45391: getting variables 13355 1727096204.45392: in VariableManager get_vars() 13355 1727096204.45436: Calling all_inventory to load vars for managed_node3 13355 1727096204.45439: Calling groups_inventory to load vars for managed_node3 13355 1727096204.45441: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096204.45449: Calling all_plugins_play to load vars for managed_node3 13355 1727096204.45451: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096204.45453: Calling groups_plugins_play to load vars for managed_node3 13355 1727096204.47108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096204.49154: done with get_vars() 13355 1727096204.49187: done getting variables 13355 1727096204.49233: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Monday 23 September 2024 08:56:44 -0400 (0:00:00.061) 0:00:53.753 ****** 13355 1727096204.49262: entering _queue_task() for managed_node3/debug 13355 1727096204.49539: worker is 1 (out of 1 available) 13355 
1727096204.49551: exiting _queue_task() for managed_node3/debug 13355 1727096204.49568: done queuing things up, now waiting for results queue to drain 13355 1727096204.49570: waiting for pending results... 13355 1727096204.49751: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13355 1727096204.49859: in run() - task 0afff68d-5257-c514-593f-00000000017f 13355 1727096204.49871: variable 'ansible_search_path' from source: unknown 13355 1727096204.49875: variable 'ansible_search_path' from source: unknown 13355 1727096204.49907: calling self._execute() 13355 1727096204.49985: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.49988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.49997: variable 'omit' from source: magic vars 13355 1727096204.50301: variable 'ansible_distribution_major_version' from source: facts 13355 1727096204.50315: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096204.50321: variable 'omit' from source: magic vars 13355 1727096204.50385: variable 'omit' from source: magic vars 13355 1727096204.50482: variable 'omit' from source: magic vars 13355 1727096204.50486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096204.50594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096204.50599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096204.50602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096204.50604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096204.50606: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096204.50608: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.50610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.50872: Set connection var ansible_shell_executable to /bin/sh 13355 1727096204.50876: Set connection var ansible_shell_type to sh 13355 1727096204.50878: Set connection var ansible_pipelining to False 13355 1727096204.50881: Set connection var ansible_connection to ssh 13355 1727096204.50883: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096204.50885: Set connection var ansible_timeout to 10 13355 1727096204.50887: variable 'ansible_shell_executable' from source: unknown 13355 1727096204.50889: variable 'ansible_connection' from source: unknown 13355 1727096204.50892: variable 'ansible_module_compression' from source: unknown 13355 1727096204.50894: variable 'ansible_shell_type' from source: unknown 13355 1727096204.50896: variable 'ansible_shell_executable' from source: unknown 13355 1727096204.50899: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.50901: variable 'ansible_pipelining' from source: unknown 13355 1727096204.50903: variable 'ansible_timeout' from source: unknown 13355 1727096204.50905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.50907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096204.50936: variable 'omit' from source: magic vars 13355 1727096204.50939: starting attempt loop 13355 1727096204.50942: running the handler 13355 1727096204.50982: variable '__network_connections_result' from source: set_fact 13355 
1727096204.51058: variable '__network_connections_result' from source: set_fact 13355 1727096204.51172: handler run complete 13355 1727096204.51278: attempt loop complete, returning result 13355 1727096204.51306: _execute() done 13355 1727096204.51309: dumping result to json 13355 1727096204.51312: done dumping result, returning 13355 1727096204.51329: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0afff68d-5257-c514-593f-00000000017f] 13355 1727096204.51332: sending task result for task 0afff68d-5257-c514-593f-00000000017f 13355 1727096204.51427: done sending task result for task 0afff68d-5257-c514-593f-00000000017f 13355 1727096204.51430: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13355 1727096204.51531: no more pending results, returning what we have 13355 1727096204.51534: results queue empty 13355 1727096204.51535: checking for any_errors_fatal 13355 1727096204.51544: done checking for any_errors_fatal 13355 1727096204.51545: checking for max_fail_percentage 13355 1727096204.51547: done checking for max_fail_percentage 13355 1727096204.51548: checking to see if all hosts have failed and the running result is not ok 13355 1727096204.51548: done checking to see if all hosts have failed 13355 1727096204.51549: getting the remaining hosts for this loop 13355 1727096204.51550: done getting the remaining hosts for this loop 13355 1727096204.51553: getting the next task for host managed_node3 13355 1727096204.51561: done getting next task for host managed_node3 13355 1727096204.51566: ^ 
task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096204.51571: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096204.51585: getting variables 13355 1727096204.51586: in VariableManager get_vars() 13355 1727096204.51639: Calling all_inventory to load vars for managed_node3 13355 1727096204.51642: Calling groups_inventory to load vars for managed_node3 13355 1727096204.51645: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096204.51656: Calling all_plugins_play to load vars for managed_node3 13355 1727096204.51659: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096204.51663: Calling groups_plugins_play to load vars for managed_node3 13355 1727096204.53368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096204.56528: done with get_vars() 13355 1727096204.56574: done getting variables 13355 1727096204.56671: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Monday 23 September 2024 08:56:44 -0400 (0:00:00.074) 0:00:53.827 ****** 13355 1727096204.56717: entering _queue_task() for managed_node3/debug 13355 1727096204.57123: worker is 1 (out of 1 available) 13355 1727096204.57137: exiting _queue_task() for managed_node3/debug 13355 1727096204.57267: done queuing things up, now waiting for results queue to drain 13355 1727096204.57271: waiting for pending results... 13355 1727096204.57594: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13355 1727096204.57646: in run() - task 0afff68d-5257-c514-593f-000000000180 13355 1727096204.57674: variable 'ansible_search_path' from source: unknown 13355 1727096204.57695: variable 'ansible_search_path' from source: unknown 13355 1727096204.57740: calling self._execute() 13355 1727096204.57874: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.57879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.57886: variable 'omit' from source: magic vars 13355 1727096204.58449: variable 'ansible_distribution_major_version' from source: facts 13355 1727096204.58454: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096204.58558: variable 'network_state' from source: role '' defaults 13355 1727096204.58562: Evaluated conditional (network_state != {}): False 13355 1727096204.58566: when evaluation is False, skipping this task 13355 1727096204.58571: _execute() done 13355 1727096204.58574: dumping result to json 13355 1727096204.58577: done 
dumping result, returning 13355 1727096204.58579: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0afff68d-5257-c514-593f-000000000180] 13355 1727096204.58582: sending task result for task 0afff68d-5257-c514-593f-000000000180 skipping: [managed_node3] => { "false_condition": "network_state != {}" } 13355 1727096204.58935: no more pending results, returning what we have 13355 1727096204.58944: results queue empty 13355 1727096204.58945: checking for any_errors_fatal 13355 1727096204.58955: done checking for any_errors_fatal 13355 1727096204.58956: checking for max_fail_percentage 13355 1727096204.58958: done checking for max_fail_percentage 13355 1727096204.58959: checking to see if all hosts have failed and the running result is not ok 13355 1727096204.58960: done checking to see if all hosts have failed 13355 1727096204.58961: getting the remaining hosts for this loop 13355 1727096204.58962: done getting the remaining hosts for this loop 13355 1727096204.58967: getting the next task for host managed_node3 13355 1727096204.58977: done getting next task for host managed_node3 13355 1727096204.58982: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096204.59071: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096204.59105: getting variables 13355 1727096204.59108: in VariableManager get_vars() 13355 1727096204.59271: Calling all_inventory to load vars for managed_node3 13355 1727096204.59276: Calling groups_inventory to load vars for managed_node3 13355 1727096204.59279: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096204.59290: Calling all_plugins_play to load vars for managed_node3 13355 1727096204.59293: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096204.59296: Calling groups_plugins_play to load vars for managed_node3 13355 1727096204.59832: done sending task result for task 0afff68d-5257-c514-593f-000000000180 13355 1727096204.59837: WORKER PROCESS EXITING 13355 1727096204.60816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096204.64032: done with get_vars() 13355 1727096204.64236: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Monday 23 September 2024 08:56:44 -0400 (0:00:00.077) 0:00:53.905 ****** 13355 1727096204.64457: entering _queue_task() for managed_node3/ping 13355 1727096204.65091: worker is 1 (out of 1 available) 13355 1727096204.65104: exiting _queue_task() for managed_node3/ping 13355 1727096204.65342: done queuing things up, now waiting for results queue to drain 13355 1727096204.65344: waiting for pending results... 
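The task above ("Show debug messages for the network_state") was skipped because the log reports `Evaluated conditional (network_state != {}): False` against the role default `network_state = {}`, while the earlier conditional `ansible_distribution_major_version != '6'` evaluated True. A minimal sketch of that short-circuit behavior (hypothetical helper names, not Ansible's actual internals — Ansible evaluates `when:` expressions through Jinja2):

```python
# Sketch of the skip logic visible in the log: a task runs only if every
# `when` condition evaluates truthy; here network_state is the role default
# {} so `network_state != {}` is False and the task is skipped without
# contacting the host. Names below are illustrative, not Ansible internals.

def should_run(task_conditions, variables):
    """Return True only if all conditions evaluate truthy against variables."""
    return all(cond(variables) for cond in task_conditions)

variables = {
    "ansible_distribution_major_version": "10",  # any value other than "6"
    "network_state": {},                         # role default, per the log
}

conditions = [
    lambda v: v["ansible_distribution_major_version"] != "6",  # True
    lambda v: v["network_state"] != {},                        # False -> skip
]

print(should_run(conditions, variables))  # False: task is skipped
```

This also matches the reported `"false_condition": "network_state != {}"` — Ansible records the first condition that evaluated False, not all of them.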
13355 1727096204.65542: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 13355 1727096204.65877: in run() - task 0afff68d-5257-c514-593f-000000000181 13355 1727096204.66095: variable 'ansible_search_path' from source: unknown 13355 1727096204.66099: variable 'ansible_search_path' from source: unknown 13355 1727096204.66102: calling self._execute() 13355 1727096204.66259: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.66269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.66386: variable 'omit' from source: magic vars 13355 1727096204.67059: variable 'ansible_distribution_major_version' from source: facts 13355 1727096204.67075: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096204.67112: variable 'omit' from source: magic vars 13355 1727096204.67196: variable 'omit' from source: magic vars 13355 1727096204.67243: variable 'omit' from source: magic vars 13355 1727096204.67304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096204.67348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096204.67382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096204.67413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096204.67431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096204.67475: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096204.67484: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.67492: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 13355 1727096204.67607: Set connection var ansible_shell_executable to /bin/sh 13355 1727096204.67629: Set connection var ansible_shell_type to sh 13355 1727096204.67639: Set connection var ansible_pipelining to False 13355 1727096204.67648: Set connection var ansible_connection to ssh 13355 1727096204.67658: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096204.67669: Set connection var ansible_timeout to 10 13355 1727096204.67705: variable 'ansible_shell_executable' from source: unknown 13355 1727096204.67713: variable 'ansible_connection' from source: unknown 13355 1727096204.67726: variable 'ansible_module_compression' from source: unknown 13355 1727096204.67836: variable 'ansible_shell_type' from source: unknown 13355 1727096204.67840: variable 'ansible_shell_executable' from source: unknown 13355 1727096204.67842: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096204.67844: variable 'ansible_pipelining' from source: unknown 13355 1727096204.67846: variable 'ansible_timeout' from source: unknown 13355 1727096204.67849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096204.67998: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13355 1727096204.68017: variable 'omit' from source: magic vars 13355 1727096204.68028: starting attempt loop 13355 1727096204.68035: running the handler 13355 1727096204.68063: _low_level_execute_command(): starting 13355 1727096204.68082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096204.68892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.68941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096204.68956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096204.68987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096204.69057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096204.70811: stdout chunk (state=3): >>>/root <<< 13355 1727096204.70862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096204.71076: stderr chunk (state=3): >>><<< 13355 1727096204.71079: stdout chunk (state=3): >>><<< 13355 1727096204.71082: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096204.71085: _low_level_execute_command(): starting 13355 1727096204.71088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605 `" && echo ansible-tmp-1727096204.709376-15704-39542172460605="` echo /root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605 `" ) && sleep 0' 13355 1727096204.71605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096204.71622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096204.71637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096204.71649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096204.71662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096204.71670: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096204.71690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.71702: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096204.71737: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.71800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096204.71814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096204.71823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096204.71896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096204.73902: stdout chunk (state=3): >>>ansible-tmp-1727096204.709376-15704-39542172460605=/root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605 <<< 13355 1727096204.74009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096204.74042: stderr chunk (state=3): >>><<< 13355 1727096204.74045: stdout chunk (state=3): >>><<< 13355 1727096204.74063: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096204.709376-15704-39542172460605=/root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096204.74109: variable 'ansible_module_compression' from source: unknown 13355 1727096204.74145: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13355 1727096204.74179: variable 'ansible_facts' from source: unknown 13355 1727096204.74239: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/AnsiballZ_ping.py 13355 1727096204.74346: Sending initial data 13355 1727096204.74349: Sent initial data (151 bytes) 13355 1727096204.74918: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096204.74966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096204.75035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096204.76670: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096204.76701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096204.76732: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp3qfjbsfd /root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/AnsiballZ_ping.py <<< 13355 1727096204.76746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/AnsiballZ_ping.py" <<< 13355 1727096204.76765: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp3qfjbsfd" to remote "/root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/AnsiballZ_ping.py" <<< 13355 1727096204.76774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/AnsiballZ_ping.py" <<< 13355 1727096204.77265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096204.77314: stderr chunk (state=3): >>><<< 13355 1727096204.77317: stdout chunk (state=3): >>><<< 13355 1727096204.77336: done transferring module to remote 13355 1727096204.77345: _low_level_execute_command(): starting 13355 1727096204.77350: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/ /root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/AnsiballZ_ping.py && sleep 0' 13355 1727096204.77811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096204.77815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096204.77817: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.77819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096204.77822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.77873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096204.77886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096204.77919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096204.79843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096204.79975: stderr chunk (state=3): >>><<< 13355 1727096204.79978: stdout chunk (state=3): >>><<< 13355 1727096204.79981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096204.79984: _low_level_execute_command(): starting 13355 1727096204.79987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/AnsiballZ_ping.py && sleep 0' 13355 1727096204.81328: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096204.81333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.81340: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096204.81343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.81577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096204.81635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096204.96999: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13355 1727096204.98581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096204.98586: stdout chunk (state=3): >>><<< 13355 1727096204.98588: stderr chunk (state=3): >>><<< 13355 1727096204.98592: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
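The chunk above completes one full module round-trip: the controller uploads `AnsiballZ_ping.py` over SFTP, marks it executable, runs it with the remote `/usr/bin/python3.12`, and parses the single JSON object the module writes to stdout. A minimal sketch of the contract that payload reflects (a simplified stand-in; the real implementation is `ansible.modules.ping` and does considerably more, including the AnsiballZ wrapper):

```python
import json

def ping_module(module_args):
    """Hypothetical, simplified sketch of the ping module's behavior:
    echo back the 'data' argument, defaulting to 'pong'."""
    data = module_args.get("data", "pong")
    # The real module fails deliberately when data == 'crash'; mirrored here.
    if data == "crash":
        raise RuntimeError("boom")
    # Modules report back to the controller as one JSON object on stdout.
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

print(json.dumps(ping_module({"data": "pong"})))
```

Run standalone, this prints the same shape seen in the stdout chunk above: `{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}`.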
13355 1727096204.98594: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096204.98596: _low_level_execute_command(): starting 13355 1727096204.98598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096204.709376-15704-39542172460605/ > /dev/null 2>&1 && sleep 0' 13355 1727096204.99235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096204.99363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096204.99386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096204.99408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096204.99421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096204.99427: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096204.99483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.99486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096204.99489: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 
1727096204.99491: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096204.99494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096204.99496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096204.99498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096204.99777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096204.99781: stderr chunk (state=3): >>>debug2: match found <<< 13355 1727096204.99783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096204.99785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096204.99791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096204.99905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096204.99965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.01904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.01909: stdout chunk (state=3): >>><<< 13355 1727096205.01911: stderr chunk (state=3): >>><<< 13355 1727096205.02076: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096205.02080: handler run complete 13355 1727096205.02086: attempt loop complete, returning result 13355 1727096205.02088: _execute() done 13355 1727096205.02090: dumping result to json 13355 1727096205.02091: done dumping result, returning 13355 1727096205.02093: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0afff68d-5257-c514-593f-000000000181] 13355 1727096205.02095: sending task result for task 0afff68d-5257-c514-593f-000000000181 13355 1727096205.02160: done sending task result for task 0afff68d-5257-c514-593f-000000000181 13355 1727096205.02162: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 13355 1727096205.02259: no more pending results, returning what we have 13355 1727096205.02263: results queue empty 13355 1727096205.02264: checking for any_errors_fatal 13355 1727096205.02273: done checking for any_errors_fatal 13355 1727096205.02273: checking for max_fail_percentage 13355 1727096205.02275: done checking for max_fail_percentage 13355 1727096205.02276: checking to see if all hosts have failed and the running result is not ok 13355 1727096205.02277: done checking to see if all hosts have failed 13355 1727096205.02277: getting the remaining hosts for this loop 13355 1727096205.02279: 
done getting the remaining hosts for this loop 13355 1727096205.02282: getting the next task for host managed_node3 13355 1727096205.02301: done getting next task for host managed_node3 13355 1727096205.02303: ^ task is: TASK: meta (role_complete) 13355 1727096205.02307: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096205.02321: getting variables 13355 1727096205.02322: in VariableManager get_vars() 13355 1727096205.02481: Calling all_inventory to load vars for managed_node3 13355 1727096205.02484: Calling groups_inventory to load vars for managed_node3 13355 1727096205.02486: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096205.02496: Calling all_plugins_play to load vars for managed_node3 13355 1727096205.02499: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096205.02502: Calling groups_plugins_play to load vars for managed_node3 13355 1727096205.04113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096205.05678: done with get_vars() 13355 1727096205.05710: done getting variables 13355 1727096205.05809: done queuing things up, now waiting for results queue to drain 13355 1727096205.05811: results queue empty 13355 1727096205.05812: checking for any_errors_fatal 13355 1727096205.05815: done checking for any_errors_fatal 13355 1727096205.05816: checking for max_fail_percentage 13355 1727096205.05817: done checking for max_fail_percentage 13355 1727096205.05818: checking to see if all hosts have failed and the running result is not ok 13355 1727096205.05818: done checking to see if all hosts have failed 13355 1727096205.05819: getting the remaining hosts for this loop 13355 1727096205.05820: done getting the remaining hosts for this loop 13355 1727096205.05822: getting the next task for host managed_node3 13355 1727096205.05826: done getting next task for host managed_node3 13355 1727096205.05828: ^ task is: TASK: Delete the device '{{ controller_device }}' 13355 1727096205.05830: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096205.05834: getting variables 13355 1727096205.05835: in VariableManager get_vars() 13355 1727096205.05856: Calling all_inventory to load vars for managed_node3 13355 1727096205.05859: Calling groups_inventory to load vars for managed_node3 13355 1727096205.05861: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096205.05866: Calling all_plugins_play to load vars for managed_node3 13355 1727096205.05871: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096205.05875: Calling groups_plugins_play to load vars for managed_node3 13355 1727096205.07012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096205.11806: done with get_vars() 13355 1727096205.11827: done getting variables 13355 1727096205.11862: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13355 1727096205.11939: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:242 Monday 23 September 2024 08:56:45 -0400 (0:00:00.475) 0:00:54.380 ****** 13355 1727096205.11960: entering _queue_task() for managed_node3/command 13355 1727096205.12274: worker is 1 (out of 1 available) 13355 1727096205.12289: exiting 
_queue_task() for managed_node3/command 13355 1727096205.12301: done queuing things up, now waiting for results queue to drain 13355 1727096205.12303: waiting for pending results... 13355 1727096205.12495: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 13355 1727096205.12581: in run() - task 0afff68d-5257-c514-593f-0000000001b1 13355 1727096205.12594: variable 'ansible_search_path' from source: unknown 13355 1727096205.12633: calling self._execute() 13355 1727096205.12718: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.12724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.12734: variable 'omit' from source: magic vars 13355 1727096205.13018: variable 'ansible_distribution_major_version' from source: facts 13355 1727096205.13028: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096205.13034: variable 'omit' from source: magic vars 13355 1727096205.13053: variable 'omit' from source: magic vars 13355 1727096205.13126: variable 'controller_device' from source: play vars 13355 1727096205.13140: variable 'omit' from source: magic vars 13355 1727096205.13179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096205.13206: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096205.13222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096205.13236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096205.13246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096205.13274: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096205.13277: 
variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.13279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.13347: Set connection var ansible_shell_executable to /bin/sh 13355 1727096205.13351: Set connection var ansible_shell_type to sh 13355 1727096205.13360: Set connection var ansible_pipelining to False 13355 1727096205.13365: Set connection var ansible_connection to ssh 13355 1727096205.13369: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096205.13372: Set connection var ansible_timeout to 10 13355 1727096205.13393: variable 'ansible_shell_executable' from source: unknown 13355 1727096205.13398: variable 'ansible_connection' from source: unknown 13355 1727096205.13401: variable 'ansible_module_compression' from source: unknown 13355 1727096205.13404: variable 'ansible_shell_type' from source: unknown 13355 1727096205.13406: variable 'ansible_shell_executable' from source: unknown 13355 1727096205.13409: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.13411: variable 'ansible_pipelining' from source: unknown 13355 1727096205.13413: variable 'ansible_timeout' from source: unknown 13355 1727096205.13415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.13518: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096205.13529: variable 'omit' from source: magic vars 13355 1727096205.13534: starting attempt loop 13355 1727096205.13538: running the handler 13355 1727096205.13553: _low_level_execute_command(): starting 13355 1727096205.13561: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 
1727096205.14070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.14106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096205.14109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096205.14113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096205.14116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.14156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.14165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096205.14181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.14234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.15903: stdout chunk (state=3): >>>/root <<< 13355 1727096205.15994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.16025: stderr chunk (state=3): >>><<< 13355 1727096205.16029: stdout chunk (state=3): >>><<< 13355 1727096205.16054: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096205.16070: _low_level_execute_command(): starting 13355 1727096205.16077: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729 `" && echo ansible-tmp-1727096205.1605399-15737-23183584936729="` echo /root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729 `" ) && sleep 0' 13355 1727096205.16528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.16532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 
13355 1727096205.16567: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.16582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.16584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.16629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.16634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096205.16636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.16681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.18669: stdout chunk (state=3): >>>ansible-tmp-1727096205.1605399-15737-23183584936729=/root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729 <<< 13355 1727096205.18766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.18800: stderr chunk (state=3): >>><<< 13355 1727096205.18803: stdout chunk (state=3): >>><<< 13355 1727096205.18820: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096205.1605399-15737-23183584936729=/root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096205.18848: variable 'ansible_module_compression' from source: unknown 13355 1727096205.18894: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096205.18929: variable 'ansible_facts' from source: unknown 13355 1727096205.18985: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/AnsiballZ_command.py 13355 1727096205.19095: Sending initial data 13355 1727096205.19098: Sent initial data (155 bytes) 13355 1727096205.19535: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.19576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096205.19580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 
10.31.14.152 debug2: match not found <<< 13355 1727096205.19582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.19584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096205.19586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.19631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.19634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096205.19636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.19680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.21328: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096205.21349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096205.21383: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp__nn8_sj /root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/AnsiballZ_command.py <<< 13355 1727096205.21395: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/AnsiballZ_command.py" <<< 13355 1727096205.21414: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp__nn8_sj" to remote "/root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/AnsiballZ_command.py" <<< 13355 1727096205.21422: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/AnsiballZ_command.py" <<< 13355 1727096205.21895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.21941: stderr chunk (state=3): >>><<< 13355 1727096205.21944: stdout chunk (state=3): >>><<< 13355 1727096205.21992: done transferring module to remote 13355 1727096205.22002: _low_level_execute_command(): starting 13355 1727096205.22007: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/ /root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/AnsiballZ_command.py && sleep 0' 13355 1727096205.22445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096205.22454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096205.22472: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.22491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096205.22499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.22545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.22548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096205.22552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.22585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.24459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.24463: stdout chunk (state=3): >>><<< 13355 1727096205.24466: stderr chunk (state=3): >>><<< 13355 1727096205.24576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096205.24581: _low_level_execute_command(): starting 13355 1727096205.24583: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/AnsiballZ_command.py && sleep 0' 13355 1727096205.25220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096205.25340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096205.25372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.25466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.49371: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-23 08:56:45.483811", "end": "2024-09-23 08:56:45.490699", "delta": "0:00:00.006888", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096205.50838: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.152 closed. 
<<< 13355 1727096205.50873: stderr chunk (state=3): >>><<< 13355 1727096205.50877: stdout chunk (state=3): >>><<< 13355 1727096205.50900: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-23 08:56:45.483811", "end": "2024-09-23 08:56:45.490699", "delta": "0:00:00.006888", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.152 closed. 
13355 1727096205.50933: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096205.50939: _low_level_execute_command(): starting 13355 1727096205.50944: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096205.1605399-15737-23183584936729/ > /dev/null 2>&1 && sleep 0' 13355 1727096205.51402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.51405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096205.51412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.51414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.51416: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096205.51418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.51476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.51480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096205.51483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.51508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.53375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.53404: stderr chunk (state=3): >>><<< 13355 1727096205.53407: stdout chunk (state=3): >>><<< 13355 1727096205.53422: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096205.53427: handler run complete 13355 1727096205.53446: Evaluated conditional (False): False 13355 1727096205.53449: Evaluated conditional (False): False 13355 1727096205.53461: attempt loop complete, returning result 13355 1727096205.53464: _execute() done 13355 1727096205.53466: dumping result to json 13355 1727096205.53470: done dumping result, returning 13355 1727096205.53477: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [0afff68d-5257-c514-593f-0000000001b1] 13355 1727096205.53482: sending task result for task 0afff68d-5257-c514-593f-0000000001b1 13355 1727096205.53582: done sending task result for task 0afff68d-5257-c514-593f-0000000001b1 13355 1727096205.53585: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.006888", "end": "2024-09-23 08:56:45.490699", "failed_when_result": false, "rc": 1, "start": "2024-09-23 08:56:45.483811" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 13355 1727096205.53646: no more pending results, returning what we have 13355 1727096205.53650: results queue empty 13355 1727096205.53651: checking for any_errors_fatal 13355 1727096205.53653: done checking for any_errors_fatal 13355 1727096205.53653: checking for max_fail_percentage 13355 1727096205.53655: done checking for max_fail_percentage 13355 1727096205.53658: checking to see if all hosts have failed and the running result is not ok 13355 1727096205.53659: done checking to see if all hosts have failed 13355 1727096205.53659: getting the remaining hosts for this loop 13355 1727096205.53661: done getting the remaining hosts for this loop 13355 1727096205.53664: getting the next task for host managed_node3 13355 1727096205.53674: 
done getting next task for host managed_node3 13355 1727096205.53677: ^ task is: TASK: Remove test interfaces 13355 1727096205.53680: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096205.53686: getting variables 13355 1727096205.53687: in VariableManager get_vars() 13355 1727096205.53739: Calling all_inventory to load vars for managed_node3 13355 1727096205.53742: Calling groups_inventory to load vars for managed_node3 13355 1727096205.53744: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096205.53754: Calling all_plugins_play to load vars for managed_node3 13355 1727096205.53759: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096205.53762: Calling groups_plugins_play to load vars for managed_node3 13355 1727096205.54595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096205.55459: done with get_vars() 13355 1727096205.55478: done getting variables 13355 1727096205.55522: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Monday 23 September 2024 08:56:45 -0400 (0:00:00.435) 0:00:54.816 ****** 13355 1727096205.55546: entering _queue_task() for managed_node3/shell 13355 1727096205.55797: worker is 1 (out of 1 available) 13355 1727096205.55811: exiting _queue_task() for managed_node3/shell 13355 1727096205.55823: done queuing things up, now waiting for results queue to drain 13355 1727096205.55824: waiting for pending results... 13355 1727096205.55992: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 13355 1727096205.56096: in run() - task 0afff68d-5257-c514-593f-0000000001b5 13355 1727096205.56109: variable 'ansible_search_path' from source: unknown 13355 1727096205.56112: variable 'ansible_search_path' from source: unknown 13355 1727096205.56141: calling self._execute() 13355 1727096205.56216: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.56220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.56229: variable 'omit' from source: magic vars 13355 1727096205.56499: variable 'ansible_distribution_major_version' from source: facts 13355 1727096205.56509: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096205.56514: variable 'omit' from source: magic vars 13355 1727096205.56554: variable 'omit' from source: magic vars 13355 1727096205.56672: variable 'dhcp_interface1' from source: play vars 13355 1727096205.56676: variable 'dhcp_interface2' from source: play vars 13355 1727096205.56691: variable 'omit' from source: magic vars 13355 1727096205.56724: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096205.56754: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096205.56770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096205.56784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096205.56792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096205.56818: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096205.56821: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.56824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.56893: Set connection var ansible_shell_executable to /bin/sh 13355 1727096205.56896: Set connection var ansible_shell_type to sh 13355 1727096205.56902: Set connection var ansible_pipelining to False 13355 1727096205.56908: Set connection var ansible_connection to ssh 13355 1727096205.56914: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096205.56920: Set connection var ansible_timeout to 10 13355 1727096205.56938: variable 'ansible_shell_executable' from source: unknown 13355 1727096205.56941: variable 'ansible_connection' from source: unknown 13355 1727096205.56944: variable 'ansible_module_compression' from source: unknown 13355 1727096205.56946: variable 'ansible_shell_type' from source: unknown 13355 1727096205.56948: variable 'ansible_shell_executable' from source: unknown 13355 1727096205.56950: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.56953: variable 'ansible_pipelining' from source: unknown 13355 1727096205.56959: variable 'ansible_timeout' 
from source: unknown 13355 1727096205.56961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.57062: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096205.57071: variable 'omit' from source: magic vars 13355 1727096205.57076: starting attempt loop 13355 1727096205.57079: running the handler 13355 1727096205.57087: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096205.57103: _low_level_execute_command(): starting 13355 1727096205.57109: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096205.57626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.57630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.57633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.57635: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.57681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.57687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096205.57699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.57750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.59389: stdout chunk (state=3): >>>/root <<< 13355 1727096205.59484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.59525: stderr chunk (state=3): >>><<< 13355 1727096205.59528: stdout chunk (state=3): >>><<< 13355 1727096205.59545: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096205.59560: _low_level_execute_command(): starting 13355 1727096205.59564: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566 `" && echo ansible-tmp-1727096205.5954516-15752-117647829582566="` echo /root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566 `" ) && sleep 0' 13355 1727096205.60017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096205.60028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096205.60030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.60033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.60036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.60072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.60085: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.60129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.62034: stdout chunk (state=3): >>>ansible-tmp-1727096205.5954516-15752-117647829582566=/root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566 <<< 13355 1727096205.62135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.62163: stderr chunk (state=3): >>><<< 13355 1727096205.62166: stdout chunk (state=3): >>><<< 13355 1727096205.62185: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096205.5954516-15752-117647829582566=/root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 
1727096205.62215: variable 'ansible_module_compression' from source: unknown 13355 1727096205.62258: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096205.62293: variable 'ansible_facts' from source: unknown 13355 1727096205.62351: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/AnsiballZ_command.py 13355 1727096205.62456: Sending initial data 13355 1727096205.62462: Sent initial data (156 bytes) 13355 1727096205.62907: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096205.62911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096205.62913: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.62915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096205.62917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096205.62919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.62965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.62977: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.63009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.64547: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13355 1727096205.64558: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096205.64581: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13355 1727096205.64613: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpci0sxlqw /root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/AnsiballZ_command.py <<< 13355 1727096205.64616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/AnsiballZ_command.py" <<< 13355 1727096205.64644: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpci0sxlqw" to remote "/root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/AnsiballZ_command.py" <<< 13355 1727096205.64649: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/AnsiballZ_command.py" <<< 13355 1727096205.65116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.65159: stderr chunk (state=3): >>><<< 13355 1727096205.65165: stdout chunk (state=3): >>><<< 13355 1727096205.65213: done transferring module to remote 13355 1727096205.65222: _low_level_execute_command(): starting 13355 1727096205.65228: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/ /root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/AnsiballZ_command.py && sleep 0' 13355 1727096205.65662: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.65666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096205.65674: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.65676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.65678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.65719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.65722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.65761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.67583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.67588: stdout chunk (state=3): >>><<< 13355 1727096205.67590: stderr chunk (state=3): >>><<< 13355 1727096205.67608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096205.67618: _low_level_execute_command(): starting 13355 1727096205.67707: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/AnsiballZ_command.py && sleep 0' 13355 1727096205.68265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096205.68281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096205.68299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096205.68425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 
setting O_NONBLOCK <<< 13355 1727096205.68447: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.68537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.87888: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-23 08:56:45.835581", "end": "2024-09-23 08:56:45.875725", "delta": "0:00:00.040144", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096205.89699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
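Editor's note: the JSON-escaped `cmd` field above is the "Remove test interfaces" shell payload. Its key idiom is capturing each `ip link delete` exit status with `|| rc="$?"` so that, under `set -e`, a failed deletion is reported instead of aborting the script. A minimal sketch of that idiom, with `false` standing in for the privileged `ip link delete <dev>` call so it runs unprivileged:

```shell
#!/usr/bin/env bash
# Sketch of the rc-capture idiom from the task's "cmd" field. Under
# `set -e`, a command that fails on the left-hand side of `||` does not
# abort the script, so each cleanup step records its status and the
# script reports the error rather than dying mid-cleanup.
# NOTE: `false` is a stand-in for `ip link delete test1`, which needs
# root and an existing interface.
set -euo pipefail
rc=0
false || rc="$?"                 # stand-in for: ip link delete test1
if [ "$rc" != 0 ]; then
    echo "ERROR - could not delete link test1 - error $rc"
fi
```

In the real task the same three-line pattern repeats for `test1`, `test2`, and `testbr`, and `exec 1>&2` routes all trace output to stderr, which is why the module result shows an empty `stdout` and the `+ ip link delete …` trace in `stderr`.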
<<< 13355 1727096205.89913: stderr chunk (state=3): >>><<< 13355 1727096205.89917: stdout chunk (state=3): >>><<< 13355 1727096205.89920: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-23 08:56:45.835581", "end": "2024-09-23 08:56:45.875725", "delta": "0:00:00.040144", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096205.89929: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/', '_ansible_remote_tmp': '~/.ansible/tmp', 
'_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096205.89932: _low_level_execute_command(): starting 13355 1727096205.89935: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096205.5954516-15752-117647829582566/ > /dev/null 2>&1 && sleep 0' 13355 1727096205.90587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096205.90630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096205.90667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096205.90734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096205.92692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096205.92728: stderr chunk (state=3): >>><<< 13355 1727096205.92738: stdout chunk (state=3): >>><<< 13355 1727096205.92764: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096205.92780: handler run complete 13355 1727096205.92814: Evaluated conditional (False): False 13355 1727096205.92830: attempt loop complete, returning result 13355 1727096205.92837: _execute() done 13355 1727096205.92844: dumping result to json 13355 1727096205.92915: done dumping result, returning 13355 1727096205.92918: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [0afff68d-5257-c514-593f-0000000001b5] 13355 1727096205.92920: sending task result for task 0afff68d-5257-c514-593f-0000000001b5 13355 1727096205.93000: done sending task result for task 0afff68d-5257-c514-593f-0000000001b5 13355 1727096205.93003: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete 
link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.040144", "end": "2024-09-23 08:56:45.875725", "rc": 0, "start": "2024-09-23 08:56:45.835581" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 13355 1727096205.93079: no more pending results, returning what we have 13355 1727096205.93083: results queue empty 13355 1727096205.93084: checking for any_errors_fatal 13355 1727096205.93096: done checking for any_errors_fatal 13355 1727096205.93097: checking for max_fail_percentage 13355 1727096205.93099: done checking for max_fail_percentage 13355 1727096205.93100: checking to see if all hosts have failed and the running result is not ok 13355 1727096205.93101: done checking to see if all hosts have failed 13355 1727096205.93102: getting the remaining hosts for this loop 13355 1727096205.93103: done getting the remaining hosts for this loop 13355 1727096205.93107: getting the next task for host managed_node3 13355 1727096205.93119: done getting next task for host managed_node3 13355 1727096205.93122: ^ task is: TASK: Stop dnsmasq/radvd services 13355 1727096205.93126: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096205.93131: getting variables 13355 1727096205.93132: in VariableManager get_vars() 13355 1727096205.93397: Calling all_inventory to load vars for managed_node3 13355 1727096205.93400: Calling groups_inventory to load vars for managed_node3 13355 1727096205.93403: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096205.93414: Calling all_plugins_play to load vars for managed_node3 13355 1727096205.93417: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096205.93421: Calling groups_plugins_play to load vars for managed_node3 13355 1727096205.95160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096205.96815: done with get_vars() 13355 1727096205.96850: done getting variables 13355 1727096205.96917: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Monday 23 September 2024 08:56:45 -0400 (0:00:00.414) 0:00:55.230 ****** 13355 1727096205.96951: entering _queue_task() for managed_node3/shell 13355 1727096205.97337: worker is 1 (out of 1 
available) 13355 1727096205.97350: exiting _queue_task() for managed_node3/shell 13355 1727096205.97365: done queuing things up, now waiting for results queue to drain 13355 1727096205.97366: waiting for pending results... 13355 1727096205.97791: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 13355 1727096205.97840: in run() - task 0afff68d-5257-c514-593f-0000000001b6 13355 1727096205.97864: variable 'ansible_search_path' from source: unknown 13355 1727096205.97874: variable 'ansible_search_path' from source: unknown 13355 1727096205.98013: calling self._execute() 13355 1727096205.98030: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.98041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.98054: variable 'omit' from source: magic vars 13355 1727096205.98479: variable 'ansible_distribution_major_version' from source: facts 13355 1727096205.98495: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096205.98506: variable 'omit' from source: magic vars 13355 1727096205.98575: variable 'omit' from source: magic vars 13355 1727096205.98616: variable 'omit' from source: magic vars 13355 1727096205.98673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096205.98715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096205.98739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096205.98764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096205.98790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096205.98825: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 13355 1727096205.98833: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.98840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.98994: Set connection var ansible_shell_executable to /bin/sh 13355 1727096205.99000: Set connection var ansible_shell_type to sh 13355 1727096205.99003: Set connection var ansible_pipelining to False 13355 1727096205.99005: Set connection var ansible_connection to ssh 13355 1727096205.99007: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096205.99012: Set connection var ansible_timeout to 10 13355 1727096205.99041: variable 'ansible_shell_executable' from source: unknown 13355 1727096205.99049: variable 'ansible_connection' from source: unknown 13355 1727096205.99071: variable 'ansible_module_compression' from source: unknown 13355 1727096205.99074: variable 'ansible_shell_type' from source: unknown 13355 1727096205.99077: variable 'ansible_shell_executable' from source: unknown 13355 1727096205.99079: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096205.99099: variable 'ansible_pipelining' from source: unknown 13355 1727096205.99102: variable 'ansible_timeout' from source: unknown 13355 1727096205.99109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096205.99319: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096205.99328: variable 'omit' from source: magic vars 13355 1727096205.99330: starting attempt loop 13355 1727096205.99333: running the handler 13355 1727096205.99335: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096205.99338: _low_level_execute_command(): starting 13355 1727096205.99347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096206.00199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.00237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096206.00259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.00290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.00363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.02027: stdout chunk (state=3): >>>/root <<< 13355 1727096206.02186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.02189: stdout chunk (state=3): >>><<< 13355 1727096206.02191: 
stderr chunk (state=3): >>><<< 13355 1727096206.02274: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096206.02278: _low_level_execute_command(): starting 13355 1727096206.02281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786 `" && echo ansible-tmp-1727096206.0221734-15775-272310442022786="` echo /root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786 `" ) && sleep 0' 13355 1727096206.02887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096206.02901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096206.02922: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096206.02942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096206.02973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096206.02986: stderr chunk (state=3): >>>debug2: match not found <<< 13355 1727096206.03001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.03085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13355 1727096206.03089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.03142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.03159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.03224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.05147: stdout chunk (state=3): >>>ansible-tmp-1727096206.0221734-15775-272310442022786=/root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786 <<< 13355 1727096206.05295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.05316: stdout chunk (state=3): >>><<< 13355 1727096206.05328: stderr chunk (state=3): >>><<< 13355 1727096206.05352: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727096206.0221734-15775-272310442022786=/root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096206.05473: variable 'ansible_module_compression' from source: unknown 13355 1727096206.05476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096206.05507: variable 'ansible_facts' from source: unknown 13355 1727096206.05606: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/AnsiballZ_command.py 13355 1727096206.05794: Sending initial data 13355 1727096206.05797: Sent initial data (156 bytes) 13355 1727096206.06410: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096206.06424: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096206.06483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.06555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.06589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.06660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.08242: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096206.08285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096206.08331: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp8o_b0ez9 /root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/AnsiballZ_command.py <<< 13355 1727096206.08341: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/AnsiballZ_command.py" <<< 13355 1727096206.08378: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmp8o_b0ez9" to remote "/root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/AnsiballZ_command.py" <<< 13355 1727096206.09122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.09126: stderr chunk (state=3): >>><<< 13355 1727096206.09172: stdout chunk (state=3): >>><<< 13355 1727096206.09175: done transferring module to remote 13355 1727096206.09181: _low_level_execute_command(): starting 13355 1727096206.09191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/ /root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/AnsiballZ_command.py && sleep 0' 13355 1727096206.09814: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096206.09829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096206.09844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096206.09887: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address <<< 13355 1727096206.09904: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13355 1727096206.10000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.10021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.10084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.11919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.11923: stdout chunk (state=3): >>><<< 13355 1727096206.11925: stderr chunk (state=3): >>><<< 13355 1727096206.11940: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096206.11949: _low_level_execute_command(): starting 13355 1727096206.11961: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/AnsiballZ_command.py && sleep 0' 13355 1727096206.12590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096206.12606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096206.12629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096206.12648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096206.12673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096206.12747: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.12786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096206.12804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.12828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.12908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.31422: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-23 08:56:46.281701", "end": "2024-09-23 08:56:46.309520", "delta": "0:00:00.027819", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096206.33324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096206.33328: stdout chunk (state=3): >>><<< 13355 1727096206.33331: stderr chunk (state=3): >>><<< 13355 1727096206.33333: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-23 08:56:46.281701", "end": "2024-09-23 08:56:46.309520", "delta": "0:00:00.027819", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd 
--remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 
13355 1727096206.33342: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096206.33344: _low_level_execute_command(): starting 13355 1727096206.33347: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096206.0221734-15775-272310442022786/ > /dev/null 2>&1 && sleep 0' 13355 1727096206.34621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096206.34788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.34858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096206.34884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.34971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.35047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.37003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.37015: stdout chunk (state=3): >>><<< 13355 1727096206.37051: stderr chunk (state=3): >>><<< 13355 1727096206.37355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096206.37359: handler run complete 13355 1727096206.37361: Evaluated conditional (False): False 13355 1727096206.37364: attempt loop complete, returning result 13355 1727096206.37366: _execute() done 13355 1727096206.37370: dumping result to json 13355 1727096206.37372: done dumping result, returning 13355 1727096206.37374: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [0afff68d-5257-c514-593f-0000000001b6] 13355 1727096206.37377: sending task result for task 0afff68d-5257-c514-593f-0000000001b6 ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.027819", "end": "2024-09-23 08:56:46.309520", "rc": 0, "start": "2024-09-23 08:56:46.281701" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 13355 1727096206.37580: no more pending results, returning what we have 13355 1727096206.37584: results queue empty 13355 
1727096206.37585: checking for any_errors_fatal 13355 1727096206.37597: done checking for any_errors_fatal 13355 1727096206.37598: checking for max_fail_percentage 13355 1727096206.37600: done checking for max_fail_percentage 13355 1727096206.37601: checking to see if all hosts have failed and the running result is not ok 13355 1727096206.37602: done checking to see if all hosts have failed 13355 1727096206.37603: getting the remaining hosts for this loop 13355 1727096206.37604: done getting the remaining hosts for this loop 13355 1727096206.37608: getting the next task for host managed_node3 13355 1727096206.37618: done getting next task for host managed_node3 13355 1727096206.37622: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 13355 1727096206.37625: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096206.37630: getting variables 13355 1727096206.37632: in VariableManager get_vars() 13355 1727096206.38100: Calling all_inventory to load vars for managed_node3 13355 1727096206.38103: Calling groups_inventory to load vars for managed_node3 13355 1727096206.38106: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096206.38121: Calling all_plugins_play to load vars for managed_node3 13355 1727096206.38125: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096206.38129: Calling groups_plugins_play to load vars for managed_node3 13355 1727096206.38932: done sending task result for task 0afff68d-5257-c514-593f-0000000001b6 13355 1727096206.38942: WORKER PROCESS EXITING 13355 1727096206.40984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096206.44277: done with get_vars() 13355 1727096206.44313: done getting variables 13355 1727096206.44488: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:248 Monday 23 September 2024 08:56:46 -0400 (0:00:00.475) 0:00:55.705 ****** 13355 1727096206.44518: entering _queue_task() for managed_node3/command 13355 1727096206.45275: worker is 1 (out of 1 available) 13355 1727096206.45287: exiting _queue_task() for managed_node3/command 13355 1727096206.45299: done queuing things up, now waiting for results queue to drain 13355 1727096206.45300: waiting for pending results... 
13355 1727096206.45759: running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript 13355 1727096206.46024: in run() - task 0afff68d-5257-c514-593f-0000000001b7 13355 1727096206.46044: variable 'ansible_search_path' from source: unknown 13355 1727096206.46156: calling self._execute() 13355 1727096206.46375: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096206.46385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096206.46397: variable 'omit' from source: magic vars 13355 1727096206.47274: variable 'ansible_distribution_major_version' from source: facts 13355 1727096206.47279: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096206.47459: variable 'network_provider' from source: set_fact 13355 1727096206.47540: Evaluated conditional (network_provider == "initscripts"): False 13355 1727096206.47549: when evaluation is False, skipping this task 13355 1727096206.47556: _execute() done 13355 1727096206.47564: dumping result to json 13355 1727096206.47574: done dumping result, returning 13355 1727096206.47586: done running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript [0afff68d-5257-c514-593f-0000000001b7] 13355 1727096206.47596: sending task result for task 0afff68d-5257-c514-593f-0000000001b7 skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13355 1727096206.48124: no more pending results, returning what we have 13355 1727096206.48129: results queue empty 13355 1727096206.48130: checking for any_errors_fatal 13355 1727096206.48142: done checking for any_errors_fatal 13355 1727096206.48143: checking for max_fail_percentage 13355 1727096206.48145: done checking for max_fail_percentage 13355 1727096206.48146: checking to see if all hosts have failed and the running result is not ok 13355 
1727096206.48147: done checking to see if all hosts have failed 13355 1727096206.48148: getting the remaining hosts for this loop 13355 1727096206.48149: done getting the remaining hosts for this loop 13355 1727096206.48154: getting the next task for host managed_node3 13355 1727096206.48163: done getting next task for host managed_node3 13355 1727096206.48166: ^ task is: TASK: Verify network state restored to default 13355 1727096206.48172: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096206.48181: getting variables 13355 1727096206.48182: in VariableManager get_vars() 13355 1727096206.48248: Calling all_inventory to load vars for managed_node3 13355 1727096206.48251: Calling groups_inventory to load vars for managed_node3 13355 1727096206.48254: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096206.48383: Calling all_plugins_play to load vars for managed_node3 13355 1727096206.48388: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096206.48395: done sending task result for task 0afff68d-5257-c514-593f-0000000001b7 13355 1727096206.48398: WORKER PROCESS EXITING 13355 1727096206.48403: Calling groups_plugins_play to load vars for managed_node3 13355 1727096206.51617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096206.54425: done with get_vars() 13355 1727096206.54455: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:253 Monday 23 September 2024 08:56:46 -0400 (0:00:00.100) 0:00:55.806 ****** 13355 1727096206.54552: entering _queue_task() for managed_node3/include_tasks 13355 1727096206.55083: worker is 1 (out of 1 available) 13355 1727096206.55095: exiting _queue_task() for managed_node3/include_tasks 13355 1727096206.55107: done queuing things up, now waiting for results queue to drain 13355 1727096206.55109: waiting for pending results... 
13355 1727096206.55251: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 13355 1727096206.55393: in run() - task 0afff68d-5257-c514-593f-0000000001b8 13355 1727096206.55415: variable 'ansible_search_path' from source: unknown 13355 1727096206.55464: calling self._execute() 13355 1727096206.55579: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096206.55592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096206.55607: variable 'omit' from source: magic vars 13355 1727096206.56025: variable 'ansible_distribution_major_version' from source: facts 13355 1727096206.56043: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096206.56053: _execute() done 13355 1727096206.56065: dumping result to json 13355 1727096206.56076: done dumping result, returning 13355 1727096206.56086: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [0afff68d-5257-c514-593f-0000000001b8] 13355 1727096206.56104: sending task result for task 0afff68d-5257-c514-593f-0000000001b8 13355 1727096206.56245: no more pending results, returning what we have 13355 1727096206.56251: in VariableManager get_vars() 13355 1727096206.56342: Calling all_inventory to load vars for managed_node3 13355 1727096206.56345: Calling groups_inventory to load vars for managed_node3 13355 1727096206.56348: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096206.56365: Calling all_plugins_play to load vars for managed_node3 13355 1727096206.56371: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096206.56375: Calling groups_plugins_play to load vars for managed_node3 13355 1727096206.56986: done sending task result for task 0afff68d-5257-c514-593f-0000000001b8 13355 1727096206.56990: WORKER PROCESS EXITING 13355 1727096206.58163: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096206.59837: done with get_vars() 13355 1727096206.59871: variable 'ansible_search_path' from source: unknown 13355 1727096206.59891: we have included files to process 13355 1727096206.59899: generating all_blocks data 13355 1727096206.59902: done generating all_blocks data 13355 1727096206.59909: processing included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13355 1727096206.59911: loading included file: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13355 1727096206.59914: Loading data from /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13355 1727096206.60344: done processing included file 13355 1727096206.60347: iterating over new_blocks loaded from include file 13355 1727096206.60348: in VariableManager get_vars() 13355 1727096206.60382: done with get_vars() 13355 1727096206.60384: filtering new block on tags 13355 1727096206.60420: done filtering new block on tags 13355 1727096206.60423: done iterating over new_blocks loaded from include file included: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 13355 1727096206.60428: extending task lists for all hosts with included blocks 13355 1727096206.61918: done extending task lists 13355 1727096206.61920: done processing included files 13355 1727096206.61921: results queue empty 13355 1727096206.61922: checking for any_errors_fatal 13355 1727096206.61924: done checking for any_errors_fatal 13355 1727096206.61925: checking for max_fail_percentage 13355 1727096206.61926: done checking for max_fail_percentage 13355 1727096206.61927: checking to see if all hosts have failed and the running 
result is not ok 13355 1727096206.61928: done checking to see if all hosts have failed 13355 1727096206.61929: getting the remaining hosts for this loop 13355 1727096206.61930: done getting the remaining hosts for this loop 13355 1727096206.61932: getting the next task for host managed_node3 13355 1727096206.61936: done getting next task for host managed_node3 13355 1727096206.61939: ^ task is: TASK: Check routes and DNS 13355 1727096206.61942: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13355 1727096206.61945: getting variables 13355 1727096206.61946: in VariableManager get_vars() 13355 1727096206.61979: Calling all_inventory to load vars for managed_node3 13355 1727096206.61983: Calling groups_inventory to load vars for managed_node3 13355 1727096206.61985: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096206.61991: Calling all_plugins_play to load vars for managed_node3 13355 1727096206.61994: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096206.61997: Calling groups_plugins_play to load vars for managed_node3 13355 1727096206.63503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096206.66147: done with get_vars() 13355 1727096206.66376: done getting variables 13355 1727096206.66425: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Monday 23 September 2024 08:56:46 -0400 (0:00:00.119) 0:00:55.925 ****** 13355 1727096206.66459: entering _queue_task() for managed_node3/shell 13355 1727096206.67299: worker is 1 (out of 1 available) 13355 1727096206.67313: exiting _queue_task() for managed_node3/shell 13355 1727096206.67331: done queuing things up, now waiting for results queue to drain 13355 1727096206.67374: waiting for pending results... 
13355 1727096206.67799: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 13355 1727096206.67960: in run() - task 0afff68d-5257-c514-593f-0000000009f0 13355 1727096206.68038: variable 'ansible_search_path' from source: unknown 13355 1727096206.68042: variable 'ansible_search_path' from source: unknown 13355 1727096206.68046: calling self._execute() 13355 1727096206.68175: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096206.68187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096206.68202: variable 'omit' from source: magic vars 13355 1727096206.68734: variable 'ansible_distribution_major_version' from source: facts 13355 1727096206.68751: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096206.68792: variable 'omit' from source: magic vars 13355 1727096206.68838: variable 'omit' from source: magic vars 13355 1727096206.68890: variable 'omit' from source: magic vars 13355 1727096206.68946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096206.69009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096206.69024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096206.69118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096206.69122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096206.69124: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096206.69126: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096206.69128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096206.69259: 
Set connection var ansible_shell_executable to /bin/sh 13355 1727096206.69292: Set connection var ansible_shell_type to sh 13355 1727096206.69303: Set connection var ansible_pipelining to False 13355 1727096206.69462: Set connection var ansible_connection to ssh 13355 1727096206.69465: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096206.69469: Set connection var ansible_timeout to 10 13355 1727096206.69578: variable 'ansible_shell_executable' from source: unknown 13355 1727096206.69581: variable 'ansible_connection' from source: unknown 13355 1727096206.69584: variable 'ansible_module_compression' from source: unknown 13355 1727096206.69588: variable 'ansible_shell_type' from source: unknown 13355 1727096206.69590: variable 'ansible_shell_executable' from source: unknown 13355 1727096206.69592: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096206.69594: variable 'ansible_pipelining' from source: unknown 13355 1727096206.69598: variable 'ansible_timeout' from source: unknown 13355 1727096206.69601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096206.70396: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096206.70402: variable 'omit' from source: magic vars 13355 1727096206.70404: starting attempt loop 13355 1727096206.70406: running the handler 13355 1727096206.70409: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096206.70411: 
_low_level_execute_command(): starting 13355 1727096206.70413: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096206.71421: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096206.71434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096206.71442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096206.71490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096206.71507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.71578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096206.71596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.71617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.71678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.73353: stdout chunk (state=3): >>>/root <<< 13355 1727096206.73473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.73484: stdout chunk (state=3): >>><<< 13355 1727096206.73487: 
stderr chunk (state=3): >>><<< 13355 1727096206.73511: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096206.73524: _low_level_execute_command(): starting 13355 1727096206.73530: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458 `" && echo ansible-tmp-1727096206.7351074-15804-14900268318458="` echo /root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458 `" ) && sleep 0' 13355 1727096206.73966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096206.73982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096206.74003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096206.74006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.74065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096206.74070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.74073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.74115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.76100: stdout chunk (state=3): >>>ansible-tmp-1727096206.7351074-15804-14900268318458=/root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458 <<< 13355 1727096206.76215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.76245: stderr chunk (state=3): >>><<< 13355 1727096206.76249: stdout chunk (state=3): >>><<< 13355 1727096206.76270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096206.7351074-15804-14900268318458=/root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096206.76303: variable 'ansible_module_compression' from source: unknown 13355 1727096206.76344: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096206.76395: variable 'ansible_facts' from source: unknown 13355 1727096206.76697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/AnsiballZ_command.py 13355 1727096206.76730: Sending initial data 13355 1727096206.76734: Sent initial data (155 bytes) 13355 1727096206.77259: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096206.77280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096206.77291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096206.77383: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.77395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096206.77407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.77425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.77485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.79114: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13355 1727096206.79120: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13355 1727096206.79139: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised 
server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096206.79172: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096206.79203: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpc9wvmook /root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/AnsiballZ_command.py <<< 13355 1727096206.79209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/AnsiballZ_command.py" <<< 13355 1727096206.79236: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmpc9wvmook" to remote "/root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/AnsiballZ_command.py" <<< 13355 1727096206.79239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/AnsiballZ_command.py" <<< 13355 1727096206.79902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.80074: stderr chunk (state=3): >>><<< 13355 1727096206.80077: stdout chunk (state=3): >>><<< 13355 1727096206.80079: done transferring module to remote 13355 1727096206.80081: _low_level_execute_command(): starting 13355 1727096206.80083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/ /root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/AnsiballZ_command.py && sleep 0' 13355 1727096206.80704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096206.80715: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096206.80723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096206.80745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.80758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096206.80820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.80835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096206.80864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.80892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.80921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.82745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096206.82774: stderr chunk (state=3): >>><<< 13355 1727096206.82778: stdout chunk (state=3): >>><<< 13355 1727096206.82793: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096206.82798: _low_level_execute_command(): starting 13355 1727096206.82804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/AnsiballZ_command.py && sleep 0' 13355 1727096206.83230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096206.83233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096206.83271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.83274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096206.83277: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096206.83279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096206.83331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096206.83334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096206.83341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096206.83385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096206.99899: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3273sec preferred_lft 3273sec\n inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 
ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:56:46.987192", "end": "2024-09-23 08:56:46.995926", "delta": "0:00:00.008734", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096207.01647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096207.01651: stdout chunk (state=3): >>><<< 13355 1727096207.01654: stderr chunk (state=3): >>><<< 13355 1727096207.01656: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3273sec preferred_lft 3273sec\n inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# 
Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-23 08:56:46.987192", "end": "2024-09-23 08:56:46.995926", "delta": "0:00:00.008734", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096207.01670: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096207.01874: _low_level_execute_command(): starting 13355 1727096207.01878: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096206.7351074-15804-14900268318458/ > /dev/null 2>&1 && sleep 0' 13355 1727096207.02582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13355 1727096207.02624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096207.02635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096207.02656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096207.02713: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096207.02736: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096207.02747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.02870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096207.02902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096207.02968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096207.04835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096207.04840: stdout chunk (state=3): >>><<< 13355 1727096207.04848: stderr chunk (state=3): >>><<< 13355 1727096207.04892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13355 1727096207.04895: handler run complete
13355 1727096207.04937: Evaluated conditional (False): False
13355 1727096207.04941: attempt loop complete, returning result
13355 1727096207.04943: _execute() done
13355 1727096207.04945: dumping result to json
13355 1727096207.04947: done dumping result, returning
13355 1727096207.05173: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [0afff68d-5257-c514-593f-0000000009f0]
13355 1727096207.05176: sending task result for task 0afff68d-5257-c514-593f-0000000009f0
13355 1727096207.05250: done sending task result for task 0afff68d-5257-c514-593f-0000000009f0
13355 1727096207.05252: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n",
    "delta": "0:00:00.008734",
    "end": "2024-09-23 08:56:46.995926",
    "rc": 0,
    "start": "2024-09-23 08:56:46.987192"
}

STDOUT:

IP
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host noprefixroute
       valid_lft forever preferred_lft forever
2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000
    link/ether 0a:ff:df:7b:8e:75 brd ff:ff:ff:ff:ff:ff
    altname enX0
    inet 10.31.14.152/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0
       valid_lft 3273sec preferred_lft 3273sec
    inet6 fe80::8ff:dfff:fe7b:8e75/64 scope link noprefixroute
       valid_lft forever preferred_lft forever
IP ROUTE
default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.152 metric 100
10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.152 metric 100
IP -6 ROUTE
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
RESOLV
# Generated by NetworkManager
search us-east-1.aws.redhat.com
nameserver 10.29.169.13
nameserver 10.29.170.12
nameserver 10.2.32.1

13355 1727096207.05367: no more pending results, returning what we have
13355 1727096207.05372: results queue empty
13355 1727096207.05373: checking for any_errors_fatal
13355 1727096207.05374: done checking for any_errors_fatal
13355 1727096207.05375: checking for max_fail_percentage
13355 1727096207.05377: done checking for max_fail_percentage
13355 1727096207.05377: checking to see if all hosts have failed and the running result is not ok
13355 1727096207.05378: done checking to see if all hosts have failed
13355 1727096207.05379: getting the remaining hosts for this loop
13355 1727096207.05380: done getting the remaining hosts for this loop
13355 1727096207.05388: getting the next task for host managed_node3
13355 1727096207.05395: done getting next task for host managed_node3
13355 1727096207.05397: ^ task is: TASK: Verify DNS and network connectivity
13355 1727096207.05400: ^ state is: HOST STATE: block=2, task=34, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13355 1727096207.05408: getting variables 13355 1727096207.05411: in VariableManager get_vars() 13355 1727096207.05464: Calling all_inventory to load vars for managed_node3 13355 1727096207.05591: Calling groups_inventory to load vars for managed_node3 13355 1727096207.05596: Calling all_plugins_inventory to load vars for managed_node3 13355 1727096207.05613: Calling all_plugins_play to load vars for managed_node3 13355 1727096207.05617: Calling groups_plugins_inventory to load vars for managed_node3 13355 1727096207.05620: Calling groups_plugins_play to load vars for managed_node3 13355 1727096207.07386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13355 1727096207.08582: done with get_vars() 13355 1727096207.08616: done getting variables 13355 1727096207.08674: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Monday 23 September 2024 
08:56:47 -0400 (0:00:00.422) 0:00:56.347 ****** 13355 1727096207.08697: entering _queue_task() for managed_node3/shell 13355 1727096207.09142: worker is 1 (out of 1 available) 13355 1727096207.09155: exiting _queue_task() for managed_node3/shell 13355 1727096207.09170: done queuing things up, now waiting for results queue to drain 13355 1727096207.09174: waiting for pending results... 13355 1727096207.09772: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 13355 1727096207.09872: in run() - task 0afff68d-5257-c514-593f-0000000009f1 13355 1727096207.09887: variable 'ansible_search_path' from source: unknown 13355 1727096207.09891: variable 'ansible_search_path' from source: unknown 13355 1727096207.09930: calling self._execute() 13355 1727096207.10079: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096207.10162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096207.10166: variable 'omit' from source: magic vars 13355 1727096207.10681: variable 'ansible_distribution_major_version' from source: facts 13355 1727096207.10717: Evaluated conditional (ansible_distribution_major_version != '6'): True 13355 1727096207.10851: variable 'ansible_facts' from source: unknown 13355 1727096207.11883: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 13355 1727096207.11888: variable 'omit' from source: magic vars 13355 1727096207.11890: variable 'omit' from source: magic vars 13355 1727096207.11893: variable 'omit' from source: magic vars 13355 1727096207.11916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13355 1727096207.11958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13355 1727096207.11978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13355 1727096207.12014: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096207.12017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13355 1727096207.12049: variable 'inventory_hostname' from source: host vars for 'managed_node3' 13355 1727096207.12052: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096207.12055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096207.12129: Set connection var ansible_shell_executable to /bin/sh 13355 1727096207.12132: Set connection var ansible_shell_type to sh 13355 1727096207.12139: Set connection var ansible_pipelining to False 13355 1727096207.12143: Set connection var ansible_connection to ssh 13355 1727096207.12149: Set connection var ansible_module_compression to ZIP_DEFLATED 13355 1727096207.12154: Set connection var ansible_timeout to 10 13355 1727096207.12178: variable 'ansible_shell_executable' from source: unknown 13355 1727096207.12249: variable 'ansible_connection' from source: unknown 13355 1727096207.12253: variable 'ansible_module_compression' from source: unknown 13355 1727096207.12255: variable 'ansible_shell_type' from source: unknown 13355 1727096207.12261: variable 'ansible_shell_executable' from source: unknown 13355 1727096207.12263: variable 'ansible_host' from source: host vars for 'managed_node3' 13355 1727096207.12265: variable 'ansible_pipelining' from source: unknown 13355 1727096207.12268: variable 'ansible_timeout' from source: unknown 13355 1727096207.12271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 13355 1727096207.12336: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096207.12345: variable 'omit' from source: magic vars 13355 1727096207.12363: starting attempt loop 13355 1727096207.12366: running the handler 13355 1727096207.12371: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13355 1727096207.12401: _low_level_execute_command(): starting 13355 1727096207.12403: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13355 1727096207.12947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096207.12965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found <<< 13355 1727096207.13033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.13042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found <<< 13355 1727096207.13047: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.13077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096207.13125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096207.14799: stdout chunk (state=3): >>>/root <<< 13355 1727096207.14896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096207.14928: stderr chunk (state=3): >>><<< 13355 1727096207.14935: stdout chunk (state=3): >>><<< 13355 1727096207.14979: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096207.14992: _low_level_execute_command(): starting 13355 1727096207.14998: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348 `" && echo ansible-tmp-1727096207.1497831-15844-244545871988348="` echo /root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348 `" ) && sleep 0' 13355 1727096207.15672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.15738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096207.15743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096207.15774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096207.17747: stdout chunk (state=3): >>>ansible-tmp-1727096207.1497831-15844-244545871988348=/root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348 <<< 13355 1727096207.17848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 
1727096207.17884: stderr chunk (state=3): >>><<< 13355 1727096207.17887: stdout chunk (state=3): >>><<< 13355 1727096207.17904: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727096207.1497831-15844-244545871988348=/root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096207.17932: variable 'ansible_module_compression' from source: unknown 13355 1727096207.17976: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13355c8m5l4ym/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13355 1727096207.18015: variable 'ansible_facts' from source: unknown 13355 1727096207.18072: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/AnsiballZ_command.py 13355 1727096207.18181: Sending initial data 13355 
1727096207.18184: Sent initial data (156 bytes) 13355 1727096207.18652: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096207.18656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.18659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration <<< 13355 1727096207.18662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.18713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096207.18716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096207.18718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096207.18763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096207.20358: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13355 1727096207.20388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13355 1727096207.20420: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmppu7o1lij /root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/AnsiballZ_command.py <<< 13355 1727096207.20423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/AnsiballZ_command.py" <<< 13355 1727096207.20449: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13355c8m5l4ym/tmppu7o1lij" to remote "/root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/AnsiballZ_command.py" <<< 13355 1727096207.20934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096207.20980: stderr chunk (state=3): >>><<< 13355 1727096207.20984: stdout chunk (state=3): >>><<< 13355 1727096207.21023: done transferring module to remote 13355 1727096207.21033: _low_level_execute_command(): starting 13355 1727096207.21037: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/ /root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/AnsiballZ_command.py && sleep 0' 13355 1727096207.21470: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096207.21506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096207.21509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096207.21512: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.21514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096207.21516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096207.21517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.21571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096207.21574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096207.21593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096207.21611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096207.23428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096207.23450: stderr chunk (state=3): >>><<< 13355 1727096207.23453: stdout chunk (state=3): >>><<< 13355 1727096207.23470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096207.23473: _low_level_execute_command(): starting 13355 1727096207.23477: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/AnsiballZ_command.py && sleep 0' 13355 1727096207.23935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13355 1727096207.23938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13355 1727096207.23940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 
1727096207.23944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13355 1727096207.23946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 <<< 13355 1727096207.23948: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13355 1727096207.24000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' <<< 13355 1727096207.24003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13355 1727096207.24009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096207.24047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096207.56888: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6571 0 --:--:-- --:--:-- --:--:-- 6630\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2722 0 --:--:-- --:--:-- --:--:-- 2745", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:56:47.389131", "end": "2024-09-23 08:56:47.565875", "delta": "0:00:00.176744", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13355 1727096207.58590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. <<< 13355 1727096207.58612: stderr chunk (state=3): >>><<< 13355 1727096207.58615: stdout chunk (state=3): >>><<< 13355 1727096207.58636: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left 
Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6571 0 --:--:-- --:--:-- --:--:-- 6630\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 2722 0 --:--:-- --:--:-- --:--:-- 2745", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-23 08:56:47.389131", "end": "2024-09-23 08:56:47.565875", "delta": "0:00:00.176744", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.152 closed. 13355 1727096207.58676: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13355 1727096207.58684: _low_level_execute_command(): starting 13355 1727096207.58689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727096207.1497831-15844-244545871988348/ > /dev/null 2>&1 && sleep 0' 13355 1727096207.59349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13355 1727096207.59629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13355 1727096207.61257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13355 1727096207.61288: stderr chunk (state=3): >>><<< 13355 1727096207.61292: stdout chunk (state=3): >>><<< 13355 1727096207.61306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.152 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.152 originally 10.31.14.152 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e9699315b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13355 1727096207.61312: handler run complete 13355 1727096207.61330: Evaluated conditional (False): False 13355 1727096207.61338: attempt loop complete, returning result 13355 1727096207.61341: _execute() done 13355 
1727096207.61344: dumping result to json
13355 1727096207.61348: done dumping result, returning
13355 1727096207.61358: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [0afff68d-5257-c514-593f-0000000009f1]
13355 1727096207.61361: sending task result for task 0afff68d-5257-c514-593f-0000000009f1
13355 1727096207.61462: done sending task result for task 0afff68d-5257-c514-593f-0000000009f1
13355 1727096207.61465: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.176744",
    "end": "2024-09-23 08:56:47.565875",
    "rc": 0,
    "start": "2024-09-23 08:56:47.389131"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   6571      0 --:--:-- --:--:-- --:--:--  6630
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   2722      0 --:--:-- --:--:-- --:--:--  2745
13355 1727096207.61533: no more pending results, returning what we have
13355 1727096207.61537: results queue empty
13355 1727096207.61537: checking for any_errors_fatal
13355 1727096207.61550: done checking for any_errors_fatal
13355 1727096207.61551: checking for max_fail_percentage
13355 1727096207.61553: done checking for max_fail_percentage
13355 1727096207.61553: checking to see if all hosts have failed and the running result is not ok
13355 1727096207.61554: done checking to see if all hosts have failed
13355 1727096207.61554: getting the remaining hosts for this loop
13355 1727096207.61558: done getting the remaining hosts for this loop
13355 1727096207.61562: getting the next task for host managed_node3
13355 1727096207.61574: done getting next task for host managed_node3
13355 1727096207.61578: ^ task is: TASK: meta (flush_handlers)
13355 1727096207.61579: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False
13355 1727096207.61589: getting variables
13355 1727096207.61591: in VariableManager get_vars()
13355 1727096207.61642: Calling all_inventory to load vars for managed_node3
13355 1727096207.61644: Calling groups_inventory to load vars for managed_node3
13355 1727096207.61647: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096207.61660: Calling all_plugins_play to load vars for managed_node3
13355 1727096207.61663: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096207.61666: Calling groups_plugins_play to load vars for managed_node3
13355 1727096207.62689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096207.64240: done with get_vars()
13355 1727096207.64274: done getting variables
13355 1727096207.64347: in VariableManager get_vars()
13355 1727096207.64374: Calling all_inventory to load vars for managed_node3
13355 1727096207.64377: Calling groups_inventory to load vars for managed_node3
13355 1727096207.64379: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096207.64384: Calling all_plugins_play to load vars for managed_node3
13355 1727096207.64386: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096207.64388: Calling groups_plugins_play to load vars for managed_node3
13355 1727096207.65661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096207.67200: done with get_vars()
13355 1727096207.67237: done queuing things up, now waiting for results queue to drain
13355 1727096207.67240: results queue empty
13355 1727096207.67241: checking for any_errors_fatal
13355 1727096207.67244: done checking for any_errors_fatal
13355 1727096207.67245: checking for max_fail_percentage
13355 1727096207.67246: done checking for max_fail_percentage
13355 1727096207.67247: checking to see if all hosts have failed and the running result is not ok
13355 1727096207.67248: done checking to see if all hosts have failed
13355 1727096207.67249: getting the remaining hosts for this loop
13355 1727096207.67250: done getting the remaining hosts for this loop
13355 1727096207.67253: getting the next task for host managed_node3
13355 1727096207.67260: done getting next task for host managed_node3
13355 1727096207.67262: ^ task is: TASK: meta (flush_handlers)
13355 1727096207.67263: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096207.67266: getting variables
13355 1727096207.67269: in VariableManager get_vars()
13355 1727096207.67292: Calling all_inventory to load vars for managed_node3
13355 1727096207.67295: Calling groups_inventory to load vars for managed_node3
13355 1727096207.67297: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096207.67303: Calling all_plugins_play to load vars for managed_node3
13355 1727096207.67305: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096207.67308: Calling groups_plugins_play to load vars for managed_node3
13355 1727096207.68460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096207.69923: done with get_vars()
13355 1727096207.69953: done getting variables
13355 1727096207.70019: in VariableManager get_vars()
13355 1727096207.70043: Calling all_inventory to load vars for managed_node3
13355 1727096207.70046: Calling groups_inventory to load vars for managed_node3
13355 1727096207.70048: Calling all_plugins_inventory to load vars for managed_node3
13355 1727096207.70053: Calling all_plugins_play to load vars for managed_node3
13355 1727096207.70055: Calling groups_plugins_inventory to load vars for managed_node3
13355 1727096207.70058: Calling groups_plugins_play to load vars for managed_node3
13355 1727096207.71342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13355 1727096207.73881: done with get_vars()
13355 1727096207.73932: done queuing things up, now waiting for results queue to drain
13355 1727096207.73935: results queue empty
13355 1727096207.73936: checking for any_errors_fatal
13355 1727096207.73937: done checking for any_errors_fatal
13355 1727096207.73938: checking for max_fail_percentage
13355 1727096207.73939: done checking for max_fail_percentage
13355 1727096207.73940: checking to see if all hosts have failed and the running result is not ok
13355 1727096207.73940: done checking to see if all hosts have failed
13355 1727096207.73941: getting the remaining hosts for this loop
13355 1727096207.73942: done getting the remaining hosts for this loop
13355 1727096207.73945: getting the next task for host managed_node3
13355 1727096207.73949: done getting next task for host managed_node3
13355 1727096207.73950: ^ task is: None
13355 1727096207.73952: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13355 1727096207.73953: done queuing things up, now waiting for results queue to drain
13355 1727096207.73954: results queue empty
13355 1727096207.73955: checking for any_errors_fatal
13355 1727096207.73955: done checking for any_errors_fatal
13355 1727096207.73956: checking for max_fail_percentage
13355 1727096207.73957: done checking for max_fail_percentage
13355 1727096207.73957: checking to see if all hosts have failed and the running result is not ok
13355 1727096207.73958: done checking to see if all hosts have failed
13355 1727096207.73961: getting the next task for host managed_node3
13355 1727096207.73964: done getting next task for host managed_node3
13355 1727096207.73965: ^ task is: None
13355 1727096207.73966: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False

PLAY RECAP *********************************************************************
managed_node3              : ok=109  changed=5    unreachable=0    failed=0    skipped=120  rescued=0    ignored=0

Monday 23 September 2024  08:56:47 -0400 (0:00:00.653)       0:00:57.001 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.25s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.12s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.03s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.99s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.80s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.62s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_removal_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.33s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install dnsmasq --------------------------------------------------------- 1.26s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.26s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.18s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_removal.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.03s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.98s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.89s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.85s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.85s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.82s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.80s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.79s
/tmp/collections-And/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
13355 1727096207.74159: RUNNING CLEANUP